Hadoop, mail # user - Invocation exception


Re: Invocation exception
Mohit Anchlia 2012-02-29, 23:49
Thanks for the example. I did look at the logs and also at the admin page
and all I see is the exception that I posted initially.

I am not sure why adding an extra jar to the classpath via DistributedCache
causes that exception. I tried looking at the Configuration code in the
hadoop.util package, but it doesn't tell me much. It looks like it's throwing
on the line "configureMethod.invoke(theObject, conf);" in the code below:
private static void setJobConf(Object theObject, Configuration conf) {
  //If JobConf and JobConfigurable are in classpath, AND
  //theObject is of type JobConfigurable AND
  //conf is of type JobConf then
  //invoke configure on theObject
  try {
    Class<?> jobConfClass =
      conf.getClassByName("org.apache.hadoop.mapred.JobConf");
    Class<?> jobConfigurableClass =
      conf.getClassByName("org.apache.hadoop.mapred.JobConfigurable");
    if (jobConfClass.isAssignableFrom(conf.getClass()) &&
        jobConfigurableClass.isAssignableFrom(theObject.getClass())) {
      Method configureMethod =
        jobConfigurableClass.getMethod("configure", jobConfClass);
      configureMethod.invoke(theObject, conf);
    }
  } catch (ClassNotFoundException e) {
    //JobConf/JobConfigurable not in classpath. no need to configure
  } catch (Exception e) {
    throw new RuntimeException("Error in configuring object", e);
  }
}
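For what it's worth, the generic "Error in configuring object" message hides the real failure because Method.invoke() wraps anything thrown inside configure() in an InvocationTargetException, whose cause is the original error. Here is a minimal, self-contained sketch of that wrapping (FakeConf and FailingConfigurable are hypothetical stand-ins, not the real Hadoop types):

```java
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;

// Hypothetical stand-ins for JobConf and a JobConfigurable mapper.
class FakeConf {}

class FailingConfigurable {
    // Simulates a mapper whose configure() fails, e.g. because a class
    // from a DistributedCache jar could not be loaded at task startup.
    public void configure(FakeConf conf) {
        throw new IllegalStateException("class from cached jar not found");
    }
}

public class InvokeDemo {
    public static void main(String[] args) {
        Object theObject = new FailingConfigurable();
        try {
            Method configureMethod =
                theObject.getClass().getMethod("configure", FakeConf.class);
            configureMethod.invoke(theObject, new FakeConf());
        } catch (InvocationTargetException e) {
            // The original error is the *cause*; Hadoop's catch (Exception e)
            // rethrows it as RuntimeException("Error in configuring object", e),
            // so only the wrapper's message shows up at first glance.
            System.out.println("real cause: " + e.getCause().getMessage());
        } catch (Exception e) {
            throw new RuntimeException("Error in configuring object", e);
        }
    }
}
```

Digging out getCause() (or the full stack trace in the task attempt logs, as suggested below) is what reveals the underlying problem, rather than the generic wrapper message.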

On Tue, Feb 28, 2012 at 9:25 PM, Harsh J <[EMAIL PROTECTED]> wrote:

> Mohit,
>
> If you visit the failed task attempt on the JT Web UI, you can see the
> complete, informative stack trace there. It would point to the exact line
> where the trouble came up and show what the real error during the
> configure phase of task initialization was.
>
> A simple attempts page goes like the following (replace job ID and
> task ID of course):
>
>
> http://host:50030/taskdetails.jsp?jobid=job_201202041249_3964&tipid=task_201202041249_3964_m_000000
>
> Once there, find and open the "All" logs link to see stdout, stderr,
> and syslog of the specific failed task attempt. You'll have more info
> sifting through this to debug your issue.
>
> This is also explained in Tom's book under the title "Debugging a Job"
> (p154, Hadoop: The Definitive Guide, 2nd ed.).
>
> On Wed, Feb 29, 2012 at 1:40 AM, Mohit Anchlia <[EMAIL PROTECTED]> wrote:
> > It looks like adding this line causes the invocation exception. I looked
> > in HDFS and I see that the file is at that path:
> >
> > DistributedCache.addFileToClassPath(new Path("/jars/common.jar"), conf);
> >
> > I have similar code for another jar,
> > "DistributedCache.addFileToClassPath(new Path("/jars/analytics.jar"), conf);",
> > but it works just fine.
> >
> >
> > On Tue, Feb 28, 2012 at 11:44 AM, Mohit Anchlia <[EMAIL PROTECTED]> wrote:
> >
> >> I commented out both the reducer and the combiner and I still see the
> >> same exception. Could it be because I have 2 jars being added?
> >>
> >> On Mon, Feb 27, 2012 at 8:23 PM, Subir S <[EMAIL PROTECTED]> wrote:
> >>
> >>> On Tue, Feb 28, 2012 at 4:30 AM, Mohit Anchlia <[EMAIL PROTECTED]> wrote:
> >>>
> >>> > For some reason I am getting an invocation exception and I don't
> >>> > see any more details other than this exception:
> >>> >
> >>> > My job is configured as:
> >>> >
> >>> >
> >>> > JobConf conf = new JobConf(FormMLProcessor.class);
> >>> > conf.addResource("hdfs-site.xml");
> >>> > conf.addResource("core-site.xml");
> >>> > conf.addResource("mapred-site.xml");
> >>> > conf.set("mapred.reduce.tasks", "0");
> >>> > conf.setJobName("mlprocessor");
> >>> > DistributedCache.addFileToClassPath(new Path("/jars/analytics.jar"), conf);
> >>> > DistributedCache.addFileToClassPath(new Path("/jars/common.jar"), conf);
> >>> > conf.setOutputKeyClass(Text.class);
> >>> > conf.setOutputValueClass(Text.class);
> >>> > conf.setMapperClass(Map.class);
> >>> > conf.setCombinerClass(Reduce.class);
> >>> > conf.setReducerClass(IdentityReducer.class);