Re: SpecificData.deepCopy exception with CDH4
It looks like if I call
setUserClassesTakesPrecedence(true)
on my job configuration, it sets the property
"mapreduce.task.classpath.user.precedence" to true, which causes the jars I
supply to appear first on the classpath.
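
For reference, here's roughly the relevant bit of my driver (just a sketch,
assuming the old mapred JobConf API that avro-mapred 1.6.1 uses):

  import org.apache.hadoop.mapred.JobConf;

  // Build the job conf for my driver class.
  JobConf conf = new JobConf(QuickMergeAvro.class);
  // Same effect as setting mapreduce.task.classpath.user.precedence=true,
  // so the jars passed with -libjars come before /usr/lib/hadoop/lib.
  conf.setUserClassesTakesPrecedence(true);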

Now I'm getting a different exception
12/07/16 15:17:41 INFO mapred.JobClient: Task Id :
attempt_201207160219_0016_r_000000_1, Status : FAILED
java.io.IOException: The temporary job-output directory
hdfs://hadoop00000/users/jlewi/staph/assembly/QuickMerge/_temporary doesn't
exist!
but I think this is an unrelated issue and I'm hoping it's just because I'm
out of space.

J

On Mon, Jul 16, 2012 at 12:17 AM, Jeremy Lewi <[EMAIL PROTECTED]> wrote:

> I printed out my classpath in the configure function of the mapper and
> reducer, and it looks like the jars in /usr/lib/hadoop/lib are still
> appearing first. So I must not be correctly setting the option to make my
> classpath come first.
>
> Any ideas what I might be doing wrong?
> J
>
>
> On Sun, Jul 15, 2012 at 11:34 PM, Jeremy Lewi <[EMAIL PROTECTED]> wrote:
>
>> Thanks Alan.
>>
>> I'm still getting the same error as before. Here's how I'm running the job:
>>
>> hadoop jar ./target/contrail-1.0-SNAPSHOT-job.jar
>> contrail.avro.QuickMergeAvro -D mapreduce.task.classpath.first=true
>> -libjars=/users/jlewi/svn_avro_1.6.1/lang/java/avro/target/avro-1.6.1.jar,/users/jlewi/svn_avro_1.6.1/lang/java/mapred/target/avro-mapred-1.6.1.jar
>> --inputpath=/users/jlewi/staph/assembly/BuildGraph
>> --outputpath=/users/jlewi/staph/assembly/QuickMerge --K=45
>>
>> I verified via the job tracker that the property
>> "mapreduce.task.classpath.first"  is getting picked up.
>>
>> It looks like the problem I'm dealing with is related to
>> https://issues.apache.org/jira/browse/AVRO-1103.
>>
>> Any ideas?
>>
>> Thanks
>> J
>>
>> On Sun, Jul 15, 2012 at 2:00 AM, Alan Miller <[EMAIL PROTECTED]> wrote:
>>
>>> Hi, just a quick idea.
>>> Also check ALL directories returned by
>>>   hadoop classpath
>>> for any Avro-related classes.
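>>>
>>> Another quick check (just a sketch): print which jar SpecificData
>>> actually gets loaded from in the task, something like
>>>
>>>   System.out.println(org.apache.avro.specific.SpecificData.class
>>>       .getProtectionDomain().getCodeSource().getLocation());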
>>>
>>> I was struggling trying to use avro-1.7.0 with CDH4 but made it work by
>>> using the -libjars option and making sure my classes are used BEFORE the
>>> standard classes. There's a config property (don't remember which) to set
>>> for that. Note the above setting is for the task's classpath; to control
>>> the classpath of your driver class, set HADOOP_CLASSPATH=... and
>>> HADOOP_USER_CLASSPATH_FIRST=true
>>>
>>> Alan
>>>
>>> Sent from my iPhone
>>>
>>> On Jul 15, 2012, at 3:59, "Jeremy Lewi" <[EMAIL PROTECTED]> wrote:
>>>
>>> > hi avro-users,
>>> >
>>> > I'm getting the following exception when using avro 1.6.1 with CDH4.
>>> > java.lang.NoSuchMethodError:
>>> > org.apache.avro.specific.SpecificData.deepCopy(Lorg/apache/avro/Schema;Ljava/lang/Object;)Ljava/lang/Object;
>>> >
>>> > The offending code is
>>> > GraphNodeData copy = (GraphNodeData)
>>> >   SpecificData.get().deepCopy(data.getSchema(), data);
>>> >
>>> > where GraphNodeData is a class generated from my AVRO record.
>>> >
>>> > The code runs just fine on CDH3. I tried rebuilding AVRO from source
>>> > and installing it in my local repo because of a previous post that said
>>> > Avro 1.6.1 in maven had been built against CDH3. I also deleted all the
>>> > avro jar files I found in
>>> > /usr/lib/hadoop
>>> >
>>> > Any ideas? Thanks!
>>> > Jeremy
>>> >
>>> >
>>> >
>>>
>>
>>
>