HDFS, mail # user - Re: UnsupportedOperationException occurs with Hadoop-2.1.0-beta jar files
Re: UnsupportedOperationException occurs with Hadoop-2.1.0-beta jar files
Vinayakumar B 2013-09-10, 15:06
Yes. The protobuf 2.5 runtime requires every piece of protobuf-generated code
in the JVM to have been generated and compiled with 2.5; it does not support
code generated by older versions. Even though 2.4-generated code compiles
without errors against the 2.5 jars, an exception is thrown at runtime.

So regenerate all your protobuf code with the 2.5 compiler, recompile, and
run again.
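The trap described above can be sketched without protobuf at all: the 2.5 runtime base class keeps the old method signature (so 2.4-generated subclasses still compile and link), but its default implementation throws, expecting 2.5-generated subclasses to override it. A minimal sketch of that shape; the class names here are invented for illustration and only mirror the relationship between protobuf's `GeneratedMessage` and a generated message class, they are not the real protobuf API:

```java
// Stand-in for the protobuf 2.5 runtime base class: the method exists,
// so old generated code compiles cleanly against it, but the default
// body throws, assuming 2.5-generated subclasses override it.
abstract class RuntimeBase {
    public Object getUnknownFields() {
        throw new UnsupportedOperationException(
            "This is supposed to be overridden by subclasses.");
    }

    @Override
    public int hashCode() {
        // Mirrors how YarnProtos' hashCode() reaches getUnknownFields()
        // in the stack trace below: any HashMap.put() triggers it.
        return getUnknownFields().hashCode();
    }
}

// Stand-in for a message class generated by the older (2.4) compiler:
// it never emits an override of getUnknownFields().
class OldGeneratedMessage extends RuntimeBase { }

public class ProtobufMismatchDemo {
    public static void main(String[] args) {
        try {
            new OldGeneratedMessage().hashCode();
        } catch (UnsupportedOperationException e) {
            // prints: Caught: This is supposed to be overridden by subclasses.
            System.out.println("Caught: " + e.getMessage());
        }
    }
}
```

This is why the failure only shows up at runtime: the compiler sees a valid inherited method, and only the call discovers the throwing default body.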

Regards,
Vinayakumar B
On Sep 10, 2013 8:58 AM, "sam liu" <[EMAIL PROTECTED]> wrote:

> This is an env issue. Hadoop-2.1.0-beta upgraded protobuf to 2.5 from
> 2.4.1, but the version of protobuf in my env was still 2.4.1, so the sqoop
> unit tests failed in my env. After I upgraded my protobuf to 2.5, all sqoop
> unit tests passed.
>
>
> 2013/9/9 sam liu <[EMAIL PROTECTED]>
>
>> Hi,
>>
>> With Hadoop-2.1.0-beta jar files, the Sqoop-1.4.3 test TestAllTables fails
>> with an UnsupportedOperationException; however, it works with
>> Hadoop-2.0.4-alpha jar files. Below are the details. Any comments? Thanks!
>>
>> 4202 [main] INFO org.apache.hadoop.mapreduce.JobSubmitter - Cleaning up
>> the staging area
>> file:/home/hadoop/gitsqoop/sqoop/build/test/hadoop/mapred/staging/hadoop1395102746/.staging/job_local1395102746_0001
>> 4203 [main] DEBUG org.apache.sqoop.util.ClassLoaderStack - Restoring
>> classloader: sun.misc.Launcher$AppClassLoader@3f363f36
>> 4203 [main] ERROR org.apache.sqoop.Sqoop - Got exception running Sqoop:
>> java.lang.UnsupportedOperationException: This is supposed to be overridden
>> by subclasses.
>> 4204 [main] ERROR com.cloudera.sqoop.testutil.ImportJobTestCase - Got
>> exception running Sqoop: java.lang.RuntimeException:
>> java.lang.UnsupportedOperationException: This is supposed to be overridden
>> by subclasses.
>> 4235 [main] WARN com.cloudera.sqoop.testutil.BaseSqoopTestCase - Can't
>> delete /home/hadoop/gitsqoop/sqoop/build/test/data/sqoop/warehouse
>> ]]></system-out>
>> <system-err><![CDATA[[Server@3a5e3a5e]: [Thread[HSQLDB Server
>> @3a5e3a5e,5,main]]: run()/openServerSocket():
>> java.net.BindException: Address already in use
>> at java.net.PlainSocketImpl.socketBind(Native Method)
>> at java.net.PlainSocketImpl.bind(PlainSocketImpl.java:413)
>> at java.net.ServerSocket.bind(ServerSocket.java:339)
>> at java.net.ServerSocket.<init>(ServerSocket.java:205)
>> at java.net.ServerSocket.<init>(ServerSocket.java:117)
>> at org.hsqldb.HsqlSocketFactory.createServerSocket(Unknown Source)
>> at org.hsqldb.Server.openServerSocket(Unknown Source)
>> at org.hsqldb.Server.run(Unknown Source)
>> at org.hsqldb.Server.access$000(Unknown Source)
>> at org.hsqldb.Server$ServerThread.run(Unknown Source)
>> Note:
>> /home/hadoop/gitsqoop/sqoop/build/test/data/sqoop-hadoop/compile/deaf162bd3f0ca30e6034d74f6909791/IMPORT_TABLE_1.java
>> uses or overrides a deprecated API.
>> Note: Recompile with -Xlint:deprecation for details.
>> java.lang.UnsupportedOperationException: This is supposed to be
>> overridden by subclasses.
>> at com.google.protobuf.GeneratedMessage.getUnknownFields(GeneratedMessage.java:180)
>> at org.apache.hadoop.yarn.proto.YarnProtos$URLProto.hashCode(YarnProtos.java:5487)
>> at org.apache.hadoop.yarn.proto.YarnProtos$LocalResourceProto.hashCode(YarnProtos.java:6167)
>> at org.apache.hadoop.yarn.api.records.impl.pb.LocalResourcePBImpl.hashCode(LocalResourcePBImpl.java:62)
>> at java.util.HashMap.hash(HashMap.java:132)
>> at java.util.HashMap.putImpl(HashMap.java:695)
>> at java.util.HashMap.put(HashMap.java:680)
>> at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:139)
>> at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:155)
>> at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:634)
>> at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:415)
>> at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
>> at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
>> at java.security.AccessController.doPrivileged(AccessController.java:310)