HBase >> mail # user >> Import HBase snapshots possible?


Re: Import HBase snapshots possible?
We have two requirements:
1. Change the name of table1.
2. Modify the primary key of table2.

table1 and table2 are not interlinked (they are in different databases).
Can we use the export and import functionality mentioned above in both cases?
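For the first requirement, a common approach (not spelled out in this thread, so treat it as a hedged sketch) is to "rename" a table by snapshotting it and cloning the snapshot under the new name; the table and snapshot names below are hypothetical:

```shell
# Sketch: renaming an HBase table via snapshot + clone_snapshot.
# 'table1', 'table1_snap', and 'table1_renamed' are placeholder names.
hbase shell <<'EOF'
disable 'table1'
snapshot 'table1', 'table1_snap'
clone_snapshot 'table1_snap', 'table1_renamed'
delete_snapshot 'table1_snap'
drop 'table1'
EOF
```

Changing the row key (requirement 2) is not something snapshots can do: a clone keeps the original key layout, so rewriting keys needs a copy job (e.g. a MapReduce read/rewrite) rather than export/import alone.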
On Fri, Aug 2, 2013 at 3:34 AM, Siddharth Karandikar <
[EMAIL PROTECTED]> wrote:

> Hi Matteo,
>
> Thanks a lot for all your help and detailed explanation. I could
> finally get it running.
>
> I am now running it like -
>     ./bin/hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot
>         -Dfs.default.name=hdfs://10.209.17.88:9000/
>         -Dhbase.rootdir=hdfs://10.209.17.88:9000/hbase-ss/
>         -snapshot s1
>         -copy-to /hbase/
>
> My HBase is running with hbase.rootdir=hdfs://10.209.17.88:9000/hbase.
>
> See how I am specifying fs.default.name now. Without that it used to
> fail for source location with - "IllegalArgumentException: Wrong FS:
> hdfs://10.209.17.88:9000/hbase-ss/s2/.hbase-snapshot/s2/.snapshotinfo
> expected: file:///"
> I am still working out how Configuration, FS type and checkPath
> interact underneath, but for now things are working for me.
>
> Thanks again.
>
> Siddharth
>
>
> On Thu, Aug 1, 2013 at 8:14 PM, Matteo Bertozzi <[EMAIL PROTECTED]>
> wrote:
> > You can't just copy the .snapshot folder... that is why the RSs are now
> > failing: the files for the cloned table are not available.
> >
> > When you specify hbase.rootdir, specify just the hbase.rootdir, the one
> > in /etc/hbase-site.xml, which doesn't contain the name of the
> > snapshot/table that you want to export (e.g.
> > hdfs://10.209.17.88:9000/hbase, not hdfs://10.209.17.88:9000/hbase/s2)
> >
> >
> >
> > Matteo
> >
> >
> >
> > On Thu, Aug 1, 2013 at 3:25 PM, Siddharth Karandikar <
> > [EMAIL PROTECTED]> wrote:
> >
> >> It's failing with '//' as well as '///'. The error suggests that it
> >> needs a local fs.
> >>
> >> 3 ///
> >> ssk01:~/siddharth/tools/hbase-0.95.1-hadoop1 # ./bin/hbase
> >> org.apache.hadoop.hbase.snapshot.ExportSnapshot
> >> -Dhbase.rootdir=hdfs:///10.209.17.88:9000/hbase/s2 -snapshot s2
> >> -copy-to /root/siddharth/tools/hbase-0.95.1-hadoop1/data/
> >> Exception in thread "main" java.io.IOException: Incomplete HDFS URI,
> >> no host: hdfs:///10.209.17.88:9000/hbase/s2
> >>         at
> >>
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:85)
> >>         at
> >> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
> >>         at
> org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
> >>         at
> org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
> >>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
> >>         at org.apache.hadoop.fs.Path.getFileSystem(Path.java:187)
> >>         at
> >> org.apache.hadoop.hbase.util.FSUtils.getRootDir(FSUtils.java:860)
> >>         at
> >>
> org.apache.hadoop.hbase.snapshot.ExportSnapshot.run(ExportSnapshot.java:594)
> >>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >>         at
> >>
> org.apache.hadoop.hbase.snapshot.ExportSnapshot.innerMain(ExportSnapshot.java:690)
> >>         at
> >>
> org.apache.hadoop.hbase.snapshot.ExportSnapshot.main(ExportSnapshot.java:694)
> >>
> >> 2 //
> >>
> >> ssk01:~/siddharth/tools/hbase-0.95.1-hadoop1 # ./bin/hbase
> >> org.apache.hadoop.hbase.snapshot.ExportSnapshot
> >> -Dhbase.rootdir=hdfs://10.209.17.88:9000/hbase/s2 -snapshot s2
> >> -copy-to /root/siddharth/tools/hbase-0.95.1-hadoop1/data/
> >> Exception in thread "main" java.lang.IllegalArgumentException: Wrong
> >> FS: hdfs://10.209.17.88:9000/hbase/s2/.hbase-snapshot/s2/.snapshotinfo,
> >> expected: file:///
> >>         at
> org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:381)
> >>         at
> >>
> org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:55)
> >>         at
> >>
> org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:393)
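The working invocation quoted above can be sketched end to end as follows. The addresses and snapshot name are the ones from this thread; adjust them to your cluster. Passing -Dfs.default.name makes unqualified source paths resolve against HDFS rather than the local filesystem, which is what triggered the "Wrong FS ... expected: file:///" error shown in the stack trace:

```shell
# Hedged sketch, generalized from the thread: export snapshot s1 from one
# HBase rootdir to another. Both -D properties override the client-side
# Hadoop/HBase configuration for this one invocation.
./bin/hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot \
    -Dfs.default.name=hdfs://10.209.17.88:9000/ \
    -Dhbase.rootdir=hdfs://10.209.17.88:9000/hbase-ss/ \
    -snapshot s1 \
    -copy-to /hbase/
```

Once the snapshot files sit under the destination cluster's hbase.rootdir, the snapshot should show up in `list_snapshots` in the hbase shell there and can be materialized into a table with `clone_snapshot 's1', 'some_table'` (table name hypothetical).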