Hive Runtime Error while closing operators (Hive user mailing list)


柏健 2013-02-17, 04:29
Re: Hive Runtime Error while closing operators
Maybe you are hitting this:

https://issues.apache.org/jira/browse/HIVE-3218
On Sun, Feb 17, 2013 at 9:59 AM, 柏健 <[EMAIL PROTECTED]> wrote:

> Hi, I wrote a Hive query like this:
> hive -e "
> set hive.exec.compress.output=true;
> set
> mapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec;
> set hive.merge.mapredfiles=true;
> set mapred.job.name=my test job;
> INSERT OVERWRITE DIRECTORY '/user/test/$year/$month/$day/'
> select *
> from
>     test.table
> where
>     hostname = 'localhost'
>     and substr(request_uri, 1, 6) = '/test/'
> ;"
>
> I run this Hive query every day, and when it runs it is split into two
> map reduce jobs.
> But the job sometimes fails, with the errors below:
>
> java.lang.RuntimeException: Hive Runtime Error while closing operators
> at org.apache.hadoop.hive.ql.exec.ExecMapper.close(ExecMapper.java:226)
> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
> at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:396)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
> at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to rename output from: hdfs://dbbak10-002:8020/tmp/hive-hadoop/hive_2013-02-17_00-10-09_598_42232731534306869/_task_tmp.-ext-10001/_tmp.000002_0 to: hdfs://dbbak10-002:8020/tmp/hive-hadoop/hive_2013-02-17_00-10-09_598_42232731534306869/_tmp.-ext-10001/000002_0.gz
> at org.apache.hadoop.hive.ql.exec.FileSinkOperator$FSPaths.commit(FileSinkOperator.java:199)
> at org.apache.hadoop.hive.ql.exec.FileSinkOperator$FSPaths.access$300(FileSinkOperator.java:101)
> at org.apache.hadoop.hive.ql.exec.FileSinkOperator.closeOp(FileSinkOperator.java:718)
> at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:557)
> at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:566)
> at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:566)
> at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:566)
> at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:566)
> at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:566)
> at org.apache.hadoop.hive.ql.exec.ExecMapper.close(ExecMapper.java:193)
> ... 8 more
>
> And this error happens only occasionally.
>
> I don't know why. If anyone knows, please tell me. Thank you very much.
>
>
--
Nitin Pawar
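[Editor's note: the `$year/$month/$day` placeholders in the quoted `INSERT OVERWRITE DIRECTORY` path are shell variables, expanded before `hive -e` ever sees the query, because the query is passed inside double quotes. A minimal, hypothetical wrapper (not from the thread) illustrating that expansion:]

```shell
#!/bin/sh
# Hypothetical wrapper, assumed by the editor: fill in today's date
# components, then let the shell substitute them into the output path.
year=$(date +%Y)
month=$(date +%m)
day=$(date +%d)
# The double-quoted string is expanded here, before hive runs.
printf "INSERT OVERWRITE DIRECTORY '/user/test/%s/%s/%s/'\n" \
    "$year" "$month" "$day"
```

If any of the three variables were unset, the path would silently collapse (e.g. `/user/test///`), which is worth guarding against in a real daily job.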