HBase >> mail # user >> Frequent fail of bulkload


vbogdanovsky 2013-04-04, 07:56
Re: Frequent fail of bulkload
Did you ever find a resolution to this issue?

Thanks,
Nick
On Thu, Apr 4, 2013 at 12:56 AM, vbogdanovsky <[EMAIL PROTECTED]> wrote:

> I have HFiles from an MR job, and when I import them into my table I often get
> exceptions like this:
> ===========================================
> hadoop jar /usr/lib/hbase/hbase-0.92.1-cdh4.1.1-security.jar completebulkload result/hfiles_dir mytable
> ===========================================
> java.io.FileNotFoundException: File does not exist:
> /user/myname/result/hfiles_dir/t/a87347681d904c639f613e21938e0669
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsUpdateTimes(FSNamesystem.java:1239)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1192)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1165)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1147)
>     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:383)
>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:170)
>     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44064)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:898)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1693)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1689)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1687)
>
> Why does this happen?
> Sometimes the same HFiles import fine, and sometimes they fail with this error.
> What should I do to eliminate this problem?
> Thanks.
>
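[Editor's note, not part of the original thread: one common cause worth checking, stated here as an assumption rather than a confirmed diagnosis, is that completebulkload (LoadIncrementalHFiles) moves HFiles into the table's region directories instead of copying them. Re-running the tool against the same source directory, or a retried/speculative task still holding the old path, then looks for files that have already been moved away, which surfaces as exactly this FileNotFoundException. A minimal local sketch of the move semantics, using the plain filesystem and made-up paths rather than HDFS:]

```shell
# Simulate the load tool's behavior: the HFile is MOVED, not copied.
# All paths are hypothetical, chosen only for this demo.
rm -rf /tmp/bulkdemo
mkdir -p /tmp/bulkdemo/hfiles_dir /tmp/bulkdemo/table
echo "kv-data" > /tmp/bulkdemo/hfiles_dir/hfile1

# What completebulkload effectively does with each HFile:
mv /tmp/bulkdemo/hfiles_dir/hfile1 /tmp/bulkdemo/table/hfile1

# A second attempt that still references the old path fails,
# because the source file no longer exists:
cat /tmp/bulkdemo/hfiles_dir/hfile1 2>/dev/null \
  || echo "File does not exist: /tmp/bulkdemo/hfiles_dir/hfile1"
```

[If this is the cause, checking with `hadoop fs -ls result/hfiles_dir` immediately before and after a load attempt should show the files disappearing from the source directory on success.]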