HBase user mailing list: Import HBase snapshots possible?


Thread:
Siddharth Karandikar 2013-08-01, 13:35
Matteo Bertozzi 2013-08-01, 13:38
Siddharth Karandikar 2013-08-01, 13:45
Matteo Bertozzi 2013-08-01, 13:47
Siddharth Karandikar 2013-08-01, 13:54
Matteo Bertozzi 2013-08-01, 14:01
Siddharth Karandikar 2013-08-01, 14:09
Matteo Bertozzi 2013-08-01, 14:14
Siddharth Karandikar 2013-08-01, 14:25
Re: Import HBase snapshots possible?
You can't just copy the .hbase-snapshot folder... that's why the RegionServers
are failing: the files for the cloned table are not available.

When you specify hbase.rootdir, use the value from /etc/hbase-site.xml,
which does not contain the name of the snapshot/table that you want to
export (e.g. hdfs://10.209.17.88:9000/hbase, not
hdfs://10.209.17.88:9000/hbase/s2).

Matteo
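[Editorial note: the distinction Matteo draws can be sketched as below. The
cluster address, snapshot name, and destination path are taken from the thread;
giving the destination as a file:// URI is an assumption here, not something
stated in the thread.]

```shell
# hbase.rootdir must be the cluster root from hbase-site.xml,
# WITHOUT the snapshot/table name appended.
HBASE_ROOT="hdfs://10.209.17.88:9000/hbase"   # not .../hbase/s2
SNAPSHOT="s2"
# Assumption: the destination is itself a filesystem URI
# (hdfs:// for a remote cluster, file:// for a local directory).
DEST="file:///root/siddharth/tools/hbase-0.95.1-hadoop1/data"

# Build the invocation (echoed here rather than run, since it needs a cluster).
CMD="./bin/hbase org.apache.hadoop.hbase.snapshot.ExportSnapshot \
-Dhbase.rootdir=${HBASE_ROOT} -snapshot ${SNAPSHOT} -copy-to ${DEST}"
echo "${CMD}"
```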

On Thu, Aug 1, 2013 at 3:25 PM, Siddharth Karandikar <
[EMAIL PROTECTED]> wrote:

> It's failing with '//' as well as '///'. The error suggests that it needs a
> local fs.
>
> 3 ///
> ssk01:~/siddharth/tools/hbase-0.95.1-hadoop1 # ./bin/hbase
> org.apache.hadoop.hbase.snapshot.ExportSnapshot
> -Dhbase.rootdir=hdfs:///10.209.17.88:9000/hbase/s2 -snapshot s2
> -copy-to /root/siddharth/tools/hbase-0.95.1-hadoop1/data/
> Exception in thread "main" java.io.IOException: Incomplete HDFS URI,
> no host: hdfs:///10.209.17.88:9000/hbase/s2
>         at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:85)
>         at
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>         at org.apache.hadoop.fs.Path.getFileSystem(Path.java:187)
>         at
> org.apache.hadoop.hbase.util.FSUtils.getRootDir(FSUtils.java:860)
>         at
> org.apache.hadoop.hbase.snapshot.ExportSnapshot.run(ExportSnapshot.java:594)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at
> org.apache.hadoop.hbase.snapshot.ExportSnapshot.innerMain(ExportSnapshot.java:690)
>         at
> org.apache.hadoop.hbase.snapshot.ExportSnapshot.main(ExportSnapshot.java:694)
>
> 2 //
>
> ssk01:~/siddharth/tools/hbase-0.95.1-hadoop1 # ./bin/hbase
> org.apache.hadoop.hbase.snapshot.ExportSnapshot
> -Dhbase.rootdir=hdfs://10.209.17.88:9000/hbase/s2 -snapshot s2
> -copy-to /root/siddharth/tools/hbase-0.95.1-hadoop1/data/
> Exception in thread "main" java.lang.IllegalArgumentException: Wrong
> FS: hdfs://10.209.17.88:9000/hbase/s2/.hbase-snapshot/s2/.snapshotinfo,
> expected: file:///
>         at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:381)
>         at
> org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:55)
>         at
> org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:393)
>         at
> org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:251)
>         at
> org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:125)
>         at
> org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:283)
>         at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:427)
>         at
> org.apache.hadoop.hbase.snapshot.SnapshotDescriptionUtils.readSnapshotInfo(SnapshotDescriptionUtils.java:296)
>         at
> org.apache.hadoop.hbase.snapshot.ExportSnapshot.getSnapshotFiles(ExportSnapshot.java:371)
>         at
> org.apache.hadoop.hbase.snapshot.ExportSnapshot.run(ExportSnapshot.java:618)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at
> org.apache.hadoop.hbase.snapshot.ExportSnapshot.innerMain(ExportSnapshot.java:690)
>         at
> org.apache.hadoop.hbase.snapshot.ExportSnapshot.main(ExportSnapshot.java:694)
>
>
>
>
> Btw, I tried one more thing. From my HDFS location, I just did a copy like
> this -
> ssk01:~/siddharth/tools/hadoop-1.1.2 # ./bin/hadoop fs -copyToLocal
> hdfs://10.209.17.88:9000/hbase/s1/.hbase-snapshot/s1
> /root/siddharth/tools/hbase-0.95.1-hadoop1/data/.hbase-snapshot/
>
> After doing this, I am able to see s1 in 'list_snapshots'. But it is
> failing at 'clone_snapshot'.
>
> hbase(main):014:0> clone_snapshot 's1', 'ts1'
>
> ERROR: java.io.IOException: Table 'ts1' not yet enabled, after 199617ms.
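[Editorial note: the copyToLocal hack above fetches only the snapshot
metadata under .hbase-snapshot, which is why the snapshot shows up in
list_snapshots (a metadata read) but clone_snapshot fails: the store files the
snapshot references were never copied. ExportSnapshot exists to copy both. A
minimal sketch, with directory names illustrative of the 0.95-era layout:]

```shell
# Recreate what the metadata-only copy produces, in a temp dir.
ROOT=$(mktemp -d)

# What 'hadoop fs -copyToLocal .../.hbase-snapshot/s1' fetched:
# the snapshot manifest only.
mkdir -p "$ROOT/.hbase-snapshot/s1"
touch "$ROOT/.hbase-snapshot/s1/.snapshotinfo"

# What clone_snapshot also needs: the store files (HFiles) the manifest
# references. A plain metadata copy brings none of them over.
HFILES=$(find "$ROOT" -type f ! -path "*/.hbase-snapshot/*" | wc -l | tr -d ' ')
echo "metadata present; referenced store files found: $HFILES"
```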
Siddharth Karandikar 2013-08-02, 07:34
Jignesh Patel 2013-08-03, 09:55
Siddharth Karandikar 2013-08-01, 13:36