Re: java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/app/MRAppMaster
Hi,

Is it expected to set the yarn.application.classpath to:
/usr/local/hadoop/etc/hadoop,/usr/local/hadoop/share/hadoop/mapreduce/*,/usr/local/hadoop/share/hadoop/mapreduce/lib/*,/usr/local/hadoop/share/hadoop/common/*,/usr/local/hadoop/share/hadoop/common/lib/*,/usr/local/hadoop/share/hadoop/hdfs/*,/usr/local/hadoop/share/hadoop/hdfs/lib*

I am trying to run the application from outside the cluster. Are there any specific settings that need to be done in the cluster so that I can go ahead with the default yarn.application.classpath?
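
(A minimal sketch of how such absolute entries could be pinned on the client-side job configuration, assuming a Hadoop 2.x client API; the paths below simply mirror the list above and the property constant is YarnConfiguration.YARN_APPLICATION_CLASSPATH -- not tested against this cluster:)

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.yarn.conf.YarnConfiguration;

  public class SubmitWithAbsoluteClasspath {
      public static void main(String[] args) {
          Configuration conf = new Configuration();
          // Pin the container classpath to absolute paths that exist on the cluster
          // nodes, so nothing depends on environment variables being defined where
          // the application master container is launched.
          conf.setStrings(YarnConfiguration.YARN_APPLICATION_CLASSPATH,
                  "/usr/local/hadoop/etc/hadoop",
                  "/usr/local/hadoop/share/hadoop/common/*",
                  "/usr/local/hadoop/share/hadoop/common/lib/*",
                  "/usr/local/hadoop/share/hadoop/hdfs/*",
                  "/usr/local/hadoop/share/hadoop/hdfs/lib/*",
                  "/usr/local/hadoop/share/hadoop/mapreduce/*",
                  "/usr/local/hadoop/share/hadoop/mapreduce/lib/*");
          // ... build the Job from this conf and submit it as usual
      }
  }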

Regards,
Subroto Sanyal
On Jun 5, 2012, at 12:25 PM, Subroto wrote:

> Hi Deva,
>
> Tried the yarn application classpath with absolute values. Still it didn't work; it failed with the same stack trace :-(
> Now the value of yarn.application.classpath was:
> /usr/local/hadoop/etc/hadoop,/usr/local/hadoop/*,/usr/local/hadoop/lib/*,/usr/local/hadoop/*,/usr/local/hadoop/lib/*,/usr/local/hadoop/*,/usr/local/hadoop/lib/*,/usr/local/hadoop/*,/usr/local/hadoop/*
>
> Cheers,
> Subroto Sanyal
> On Jun 5, 2012, at 12:07 PM, Devaraj k wrote:
>
>> Hi Subroto,
>>  
>>     It will not use yarn-env.sh for launching the application master. The NM uses the environment set by the client for launching the application master. Can you set the environment variables in /etc/profile, or update the yarn application classpath with absolute paths?
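
(A rough way to picture the difference, as a hypothetical helper rather than Hadoop code: an entry such as $HADOOP_COMMON_HOME/* only turns into a usable path if that variable is defined in the environment that actually launches the AM container, whereas an absolute entry needs no environment at all:)

  // Hypothetical illustration only (not Hadoop code): classpath entries written as
  // $VARIABLE/... are useless unless VARIABLE is defined in the environment that
  // launches the container; absolute entries do not depend on any environment.
  import java.util.Map;

  public class ClasspathEntryDemo {
      // Expand one classpath entry against an environment, or return null if it
      // references a variable that is not set there.
      static String expand(String entry, Map<String, String> env) {
          if (!entry.startsWith("$")) {
              return entry;                       // absolute path: nothing to resolve
          }
          int slash = entry.indexOf('/');
          String var = (slash < 0) ? entry.substring(1) : entry.substring(1, slash);
          String value = env.get(var);
          if (value == null) {
              return null;                        // variable missing where the container starts
          }
          return (slash < 0) ? value : value + entry.substring(slash);
      }

      public static void main(String[] args) {
          Map<String, String> env = System.getenv();
          System.out.println(expand("$HADOOP_COMMON_HOME/*", env));
          System.out.println(expand("/usr/local/hadoop/share/hadoop/common/*", env));
      }
  }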
>>  
>>  
>> Thanks
>> Devaraj
>> From: Subroto [[EMAIL PROTECTED]]
>> Sent: Tuesday, June 05, 2012 2:25 PM
>> To: [EMAIL PROTECTED]
>> Subject: Re: java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/app/MRAppMaster
>>
>> Hi Deva,
>>
>> Thanks for your response.
>> The file etc/hadoop/yarn-env.sh has the following entries:
>> export HADOOP_MAPRED_HOME=/usr/local/hadoop
>> export HADOOP_COMMON_HOME=/usr/local/hadoop
>> export HADOOP_HDFS_HOME=/usr/local/hadoop
>> export YARN_HOME=/usr/local/hadoop
>> export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
>> export YARN_CONF_DIR=$HADOOP_CONF_DIR
>>
>>
>> Is it expected to have these variables in the profile file of the Linux user?
>>
>> I am not using a Windows client. My client is running on Mac and the cluster is running on Linux.
>>
>> Cheers,
>> Subroto Sanyal
>> On Jun 5, 2012, at 10:50 AM, Devaraj k wrote:
>>
>>> Can you check that all the Hadoop environment variables are set properly in the environment in which the app master is getting launched?
>>>  
>>>  
>>> If you are submitting from Windows, this might be the issue: https://issues.apache.org/jira/browse/MAPREDUCE-4052.
>>>  
>>> Thanks
>>> Devaraj
>>> From: Subroto [[EMAIL PROTECTED]]
>>> Sent: Tuesday, June 05, 2012 2:14 PM
>>> To: [EMAIL PROTECTED]
>>> Subject: java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/app/MRAppMaster
>>>
>>> Hi,
>>>
>>> While running MR jobs on a YARN cluster I keep getting:
>>> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/v2/app/MRAppMaster
>>> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.v2.app.MRAppMaster
>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>> at java.security.AccessController.doPrivileged(Native Method)
>>> at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>>> Could not find the main class: org.apache.hadoop.mapreduce.v2.app.MRAppMaster.  Program will exit.
>>>
>>> My client is running in a different environment from the one where the cluster is running.
>>> If I submit a job from the cluster environment, it runs successfully.
>>>
>>> I have verified the property yarn.application.classpath before submitting it from the client. The value is set to:
>>> $HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*,$HADOOP_HDFS_HOME/*,$HADOOP_HDFS_HOME/lib/*,$HADOOP_MAPRED_HOME/*,$HADOOP_MAPRED_HOME/lib/*,$YARN_HOME/*,$YARN_HOME/lib/*
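
(One way to see what the client will actually ship as the application classpath is a small diagnostic sketch like the following, assuming a Hadoop 2.x client where YarnConfiguration also exposes DEFAULT_YARN_APPLICATION_CLASSPATH; entries that still contain $-variables only resolve if those variables are defined where the container is eventually launched:)

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.yarn.conf.YarnConfiguration;

  public class PrintAppClasspath {
      public static void main(String[] args) {
          Configuration conf = new Configuration();
          // Effective yarn.application.classpath entries as the client resolves them,
          // falling back to the built-in default when the property is not set.
          String[] entries = conf.getStrings(
                  YarnConfiguration.YARN_APPLICATION_CLASSPATH,
                  YarnConfiguration.DEFAULT_YARN_APPLICATION_CLASSPATH);
          for (String entry : entries) {
              System.out.println(entry
                      + (entry.contains("$") ? "   <-- depends on an env variable" : ""));
          }
      }
  }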