Accumulo >> mail # user >> Jobs failing with ClassNotFoundException


Re: Jobs failing with ClassNotFoundException
Is it possible that ToolRunner.run isn't working right? How might I
determine that it's putting the libs into the distributed cache?
--
Chris
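
[Editorial note, a hedged sketch: assuming Hadoop's usual behavior, ToolRunner hands the generic options to GenericOptionsParser, which stores the -libjars value in the job configuration under the key "tmpjars" — so one quick check is to log getConf().get("tmpjars") from the Tool's run() method before submitting, or to look for the tmpjars property in the submitted job's job.xml. A related gotcha worth testing: generic options are only recognized when they come before the job's own arguments. The plain-Java stand-in below (a hypothetical helper, not the Hadoop API) mimics that documented ordering rule:]

```java
// Plain-Java stand-in for the generic-option ordering rule:
// "-libjars" is only picked up if it appears before the first
// plain (non-option) argument.  Hypothetical helper, not Hadoop code.
public class LibJarsCheck {
    /** Returns the -libjars value, or null if it would not be parsed. */
    static String findLibJars(String[] args) {
        for (int i = 0; i < args.length; i++) {
            if (args[i].equals("-libjars") && i + 1 < args.length) {
                return args[i + 1];
            }
            if (!args[i].startsWith("-")) {
                return null; // a plain argument ends generic-option parsing
            }
            i++; // skip this option's value (assumes each option takes one)
        }
        return null;
    }

    public static void main(String[] args) {
        // Mirrors the thread's invocation: -libjars before the tool args.
        String good = findLibJars(new String[] {
            "-libjars", "/opt/accumulo/lib/accumulo-core-1.4.2.jar", "inst"});
        // Misplaced after the tool args: would be ignored.
        String bad = findLibJars(new String[] {
            "inst", "-libjars", "/opt/accumulo/lib/accumulo-core-1.4.2.jar"});
        System.out.println("before args: " + good);
        System.out.println("after args:  " + bad);
    }
}
```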
On Thu, Feb 14, 2013 at 3:17 PM, Chris Sigman <[EMAIL PROTECTED]> wrote:

> All of those jars exist, and they're no different from when I run one of
> the example jobs.  I'm also using ToolRunner.run.
>
>
> --
> Chris
>
>
> On Thu, Feb 14, 2013 at 2:34 PM, William Slacum <
> [EMAIL PROTECTED]> wrote:
>
>> Make sure that all of the jars you pass to libjars exist and you're using
>> ToolRunner.run, which will parse out those options.
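
[Editorial note: a minimal sketch of that existence/readability check for a -libjars style comma-separated list. Note this only audits the machine it runs on, not the task-tracker nodes; the default path below is taken from the list quoted later in the thread.]

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

// Sketch: verify every entry in a comma-separated jar list exists
// and is readable by the current user on this machine.
public class JarListAudit {
    static List<String> audit(String libJars) {
        List<String> problems = new ArrayList<>();
        for (String path : libJars.split(",")) {
            File f = new File(path.trim());
            if (!f.isFile()) {
                problems.add(path + ": missing");
            } else if (!f.canRead()) {
                problems.add(path + ": not readable");
            }
        }
        return problems;
    }

    public static void main(String[] args) {
        String libJars = args.length > 0 ? args[0]
            : "/opt/accumulo/lib/accumulo-core-1.4.2.jar";
        List<String> problems = audit(libJars);
        if (problems.isEmpty()) {
            System.out.println("all jars present and readable");
        } else {
            for (String p : problems) {
                System.out.println(p);
            }
        }
    }
}
```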
>>
>>
>> On Thu, Feb 14, 2013 at 2:20 PM, Chris Sigman <[EMAIL PROTECTED]> wrote:
>>
>>> Yes, everything's readable by everyone.  As I said before, the odd thing
>>> is that running one of the example jobs like WordCount works just fine.
>>>
>>>
>>> --
>>> Chris
>>>
>>>
>>> On Thu, Feb 14, 2013 at 2:17 PM, Keith Turner <[EMAIL PROTECTED]> wrote:
>>>
>>>> On Thu, Feb 14, 2013 at 1:53 PM, Chris Sigman <[EMAIL PROTECTED]>
>>>> wrote:
>>>> > Yep, all of the jars are also available on the datanodes
>>>>
>>>> Also are the jars readable by the user running the M/R job?
>>>>
>>>> >
>>>> >
>>>> > --
>>>> > Chris
>>>> >
>>>> >
>>>> > On Thu, Feb 14, 2013 at 1:51 PM, Billie Rinaldi <[EMAIL PROTECTED]>
>>>> wrote:
>>>> >>
>>>> >> On Thu, Feb 14, 2013 at 10:41 AM, Chris Sigman <[EMAIL PROTECTED]>
>>>> wrote:
>>>> >>>
>>>> >>> Hi everyone,
>>>> >>>
>>>> >>> I've got a job that's failing, and I can't figure out why.  Jobs I've
>>>> >>> tried from the examples run just fine.  I'm running the job via
>>>> >>>
>>>> >>> > ./bin/tool.sh ~/MovingAverage.jar movingaverage.MAJob inst namenode root pass stockdata movingaverage
>>>> >>>
>>>> >>> which I see is running the following exec call that seems perfect to me:
>>>> >>>
>>>> >>> exec /usr/lib/hadoop/bin/hadoop jar /MovingAverage.jar movingaverage.MAJob -libjars "/opt/accumulo/lib/libthrift-0.6.1.jar,/opt/accumulo/lib/accumulo-core-1.4.2.jar,/usr/lib/zookeeper//zookeeper-3.3.5-cdh3u5.jar,/opt/accumulo/lib/cloudtrace-1.4.2.jar,/opt/accumulo/lib/commons-collections-3.2.jar,/opt/accumulo/lib/commons-configuration-1.5.jar,/opt/accumulo/lib/commons-io-1.4.jar,/opt/accumulo/lib/commons-jci-core-1.0.jar,/opt/accumulo/lib/commons-jci-fam-1.0.jar,/opt/accumulo/lib/commons-lang-2.4.jar,/opt/accumulo/lib/commons-logging-1.0.4.jar,/opt/accumulo/lib/commons-logging-api-1.0.4.jar" inst namenode root pass tmpdatatable movingaverage
>>>> >>
>>>> >>
>>>> >> Does /opt/accumulo/lib/accumulo-core-1.4.2.jar exist on your hadoop
>>>> >> nodes, specifically the one that's running the map?
>>>> >>
>>>> >> Billie
>>>> >>
>>>> >>
>>>> >>>
>>>> >>>
>>>> >>> but when the job runs, it gets to the map phase and fails:
>>>> >>>
>>>> >>> 13/02/14 13:25:26 INFO mapred.JobClient: Task Id :
>>>> >>> attempt_201301171408_0293_m_000000_0, Status : FAILED
>>>> >>> java.lang.RuntimeException: java.lang.ClassNotFoundException:
>>>> >>> org.apache.accumulo.core.client.mapreduce.AccumuloInputFormat
>>>> >>>     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1004)
>>>> >>>     at org.apache.hadoop.mapreduce.JobContext.getInputFormatClass(JobContext.java:205)
>>>> >>>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:606)
>>>> >>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>>>> >>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:266)
>>>> >>>     at java.security.AccessController.doPrivileged(Native Method)
>>>> >>>     at javax.security.auth.Subject.doAs(Subject.java:415)
>>>> >>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1278)
>>>> >>>     at org.apache.hadoop.mapred.Child.main(Child.java:260)
>>>> >>> Caused by: java.lang.ClassNotFoundException:
>>>> >>> org.apache.accumulo.core.client.mapreduce.AccumuloInputFormat
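
[Editorial note: the trace places the failure inside the task JVM (Child.main), i.e. the class resolves on the client at submit time but not on the node running the map attempt — consistent with the jar never reaching the distributed cache or the task classpath. A minimal plain-Java stand-in for the lookup that Configuration.getClass ultimately performs (a bare Class.forName; in any JVM without accumulo-core on the classpath it fails the same way):]

```java
// Minimal stand-in for resolving the input format class by name,
// as the task JVM must.  Without accumulo-core on the classpath this
// throws the same ClassNotFoundException seen in the task logs.
public class ResolveInputFormat {
    static boolean resolves(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String name =
            "org.apache.accumulo.core.client.mapreduce.AccumuloInputFormat";
        System.out.println(name + " resolves: " + resolves(name));
    }
}
```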