HBase >> mail # user >> Bulkload Problem


Bulkload Problem
Hi,

I'm trying to load a large amount of data into an HBase cluster. I've
successfully imported up to 3,000 million (3 billion) datasets (KV pairs),
but when I try to import 6,000 million I get this error after 60-95% of
the import: http://pastebin.com/CCp6kS3m ...

The system is not crashing or anything like that; all nodes are still up.
It seems to me that one node is temporarily unavailable. Is it maybe
possible to increase the retry count? (I think the default is 10.) Which
setting do I have to change for that?
I'm using Cloudera 4.4.0-1 and HBase version 0.94.6-cdh4.4.0.
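
From what I've read, the setting might be `hbase.client.retries.number`
in hbase-site.xml (that's my assumption, I haven't confirmed it's the one
hit during bulkload). A sketch of what I'd change, if that's the right
property:

```xml
<!-- hbase-site.xml: raise the client-side retry count (assumed default: 10) -->
<property>
  <name>hbase.client.retries.number</name>
  <value>35</value>
</property>
```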

regards,

john