Hive >> mail # user >> Hive 12 with Hadoop 2.x with ORC


Re: Hive 12 with Hadoop 2.x with ORC
The protobuf 2.5 upgrade did not make it into Hive 0.12 (HIVE-5112).
You might want to apply the protobuf update patch on top of 0.12 to
use it with recent versions of Hadoop 2.x (but I am not certain that
this is a protobuf version issue).
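For context (my own note, not from the thread): the "This is supposed to be overridden by subclasses" message is the classic symptom of classes generated by an older protoc (2.4.x) running against a newer protobuf-java runtime (2.5.x), whose getSerializedSize() path calls getUnknownFields() and expects generated subclasses to override it. A stripped-down sketch of that pattern, with illustrative class names rather than the real protobuf API:

```java
// Minimal sketch of the protobuf 2.4-vs-2.5 failure mode.
// GeneratedBase stands in for the runtime's GeneratedMessage;
// OldGeneratedMessage stands in for a class emitted by an older protoc
// that does not override getUnknownFields().
abstract class GeneratedBase {
    // Newer runtime expects every generated subclass to override this.
    Object getUnknownFields() {
        throw new UnsupportedOperationException(
                "This is supposed to be overridden by subclasses.");
    }

    // Newer runtime consults unknown fields while computing sizes,
    // which is why the error surfaces from getSerializedSize().
    int getSerializedSize() {
        getUnknownFields();
        return 0;
    }
}

class OldGeneratedMessage extends GeneratedBase { }

public class Demo {
    public static void main(String[] args) {
        try {
            new OldGeneratedMessage().getSerializedSize();
        } catch (UnsupportedOperationException e) {
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

Regenerating the Hive-side classes with protoc 2.5 (what the HIVE-5112 patch does) makes the override exist again, so the base-class method is never hit.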
On Tue, Oct 22, 2013 at 6:53 AM, Rajesh Balamohan
<[EMAIL PROTECTED]> wrote:
> Hi All,
>
> When running Hive 12 on Hadoop 2.x with ORC, I get the following error
> while converting a text-file table to an ORC-format table. Any help would
> be greatly appreciated.
>
> 2013-10-22 06:50:49,563 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.lang.RuntimeException: Hive Runtime Error while closing operators
> at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.close(ExecMapper.java:240)
> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:396)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
> Caused by: java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.
> at com.google.protobuf.GeneratedMessage.getUnknownFields(GeneratedMessage.java:180)
> at org.apache.hadoop.hive.ql.io.orc.OrcProto$ColumnStatistics.getSerializedSize(OrcProto.java:3046)
> at com.google.protobuf.CodedOutputStream.computeMessageSizeNoTag(CodedOutputStream.java:749)
> at com.google.protobuf.CodedOutputStream.computeMessageSize(CodedOutputStream.java:530)
> at org.apache.hadoop.hive.ql.io.orc.OrcProto$RowIndexEntry.getSerializedSize(OrcProto.java:4129)
> at com.google.protobuf.CodedOutputStream.computeMessageSizeNoTag(CodedOutputStream.java:749)
> at com.google.protobuf.CodedOutputStream.computeMessageSize(CodedOutputStream.java:530)
> at org.apache.hadoop.hive.ql.io.orc.OrcProto$RowIndex.getSerializedSize(OrcProto.java:4641)
> at com.google.protobuf.AbstractMessageLite.writeTo(AbstractMessageLite.java:75)
> at org.apache.hadoop.hive.ql.io.orc.WriterImpl$TreeWriter.writeStripe(WriterImpl.java:548)
> at org.apache.hadoop.hive.ql.io.orc.WriterImpl$StructTreeWriter.writeStripe(WriterImpl.java:1328)
> at org.apache.hadoop.hive.ql.io.orc.WriterImpl.flushStripe(WriterImpl.java:1699)
> at org.apache.hadoop.hive.ql.io.orc.WriterImpl.close(WriterImpl.java:1868)
> at org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat$OrcRecordWriter.close(OrcOutputFormat.java:95)
> at org.apache.hadoop.hive.ql.exec.FileSinkOperator$FSPaths.closeWriters(FileSinkOperator.java:181)
> at org.apache.hadoop.hive.ql.exec.FileSinkOperator.closeOp(FileSinkOperator.java:866)
> at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:596)
> at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:613)
> at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:613)
> at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:613)
> at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.close(ExecMapper.java:207)
> ... 8 more
>
> --
> ~Rajesh.B
