Re: Error starting MRAppMaster

This has got nothing to do with the scheduler.

I believe this has to do with a compilation issue. How did you build Hadoop?

Also, I found that the repo at github (which is a mirror of the git repo at apache) doesn't always pick up all the commits immediately. You are better off checking out from svn (http://hadoop.apache.org/common/version_control.html).
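
For anyone following along, a checkout and build along those lines might look like the sketch below; the svn URL and Maven flags are assumptions from memory, so confirm them against the version_control page above and BUILDING.txt in the source tree.

----------------
# Assumed trunk location on Apache svn (verify against the version_control page)
svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk/ hadoop-trunk
cd hadoop-trunk

# Typical full build with tests skipped (see BUILDING.txt for the authoritative flags)
mvn clean install -DskipTests
----------------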

HTH,
+Vinod
On Jun 21, 2012, at 9:34 AM, Prajakta Kalmegh wrote:

> Okay, the below problem was fixed after commenting out the property to use
> CapacityScheduler from my yarn-site.xml. I found a reference to this in
> this JIRA <https://issues.apache.org/jira/browse/MAPREDUCE-4339>. YARN now
> uses the default FIFO scheduler.
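
(For reference, the yarn-site.xml entry being commented out is presumably the scheduler class property; the snippet below is a sketch using the standard property name and class, not text quoted from the original mail.)

----------------
<!-- Commenting this out makes the RM fall back to the default (FIFO) scheduler. -->
<property>
  <name>yarn.resourcemanager.scheduler.class</name>
  <value>org.apache.hadoop.yarn.server.resourcemanager.scheduler.capacity.CapacityScheduler</value>
</property>
----------------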
>
> Is the Dispatcher used in the CapacityScheduler's (or any scheduler's)
> flow?
>
> Thanks in advance.
> Regards,
> Prajakta
>
>
>
>
>
> On Thu, Jun 21, 2012 at 10:53 AM, Prajakta Kalmegh <[EMAIL PROTECTED]> wrote:
>
>> Hi Jagat
>>
>> Thanks for your reply. I am not using Pig. I have the latest Hadoop
>> running, cloned from the github trunk. Actually I was able to execute my
>> programs until I refreshed my forked github copy yesterday and ran a build
>> on it.  :(
>>
>> I figured out from debugging that the lines that are causing the problem are:
>> ----------------
>>    //service to log job history events
>>    EventHandler<JobHistoryEvent> historyService =
>>        createJobHistoryHandler(context);
>>
>>    dispatcher.register(org.apache.hadoop.mapreduce.jobhistory.EventType.class,
>>        historyService);
>> ----------------
>> in the MRAppMaster.java class. The reason is that the register method
>> from org.apache.hadoop.yarn.event.Dispatcher is not
>> accepting EventHandler<JobHistoryEvent> as the second argument. Not sure
>> why.
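
A side note on that diagnosis: the register signature reported in the error takes a raw EventHandler, so a typed EventHandler<JobHistoryEvent> should be acceptable as the second argument; the "not applicable" message is more plausibly fallout from the first argument's type (EventType) failing to resolve, as the later log line suggests. A minimal, self-contained Java sketch with toy stand-in types (not the actual YARN classes) showing that a raw handler parameter accepts a parameterized handler:

----------------
// Toy stand-ins for the YARN interfaces, for illustration only.
interface EventHandler<T> { void handle(T event); }

public class RegisterDemo {
    enum EventType { JOB_HISTORY_EVENT }

    // Mirrors the shape reported in the error message:
    //   register(Class<? extends Enum>, EventHandler)
    @SuppressWarnings("rawtypes")
    static void register(Class<? extends Enum> eventType, EventHandler handler) {
        System.out.println("registered handler for " + eventType.getSimpleName());
    }

    public static void main(String[] args) {
        // A parameterized handler is happily accepted by the raw EventHandler parameter.
        EventHandler<String> historyService = new EventHandler<String>() {
            public void handle(String event) { System.out.println("handled " + event); }
        };
        register(EventType.class, historyService);
        historyService.handle("job history event");
    }
}
----------------

If that reading is right, the root cause is the unresolved EventType (a generated source in the jobhistory package, if memory serves), which lines up with the build problem suggested above.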
>>
>> Regards,
>> Prajakta
>>
>>
>>
>> On Thu, Jun 21, 2012 at 10:22 AM, Jagat Singh <[EMAIL PROTECTED]> wrote:
>>
>>> It seems you are using Pig with Hadoop 0.23 or the 2.0 version.
>>>
>>> Can you quickly recompile Pig with the 23 option and try this again?
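
(For anyone who does hit this with Pig: the recompile being suggested is Pig's Ant build with the hadoop-version switch; the exact invocation below is an assumption from memory, so check Pig's build.xml before relying on it.)

----------------
# Assumed Pig rebuild against the 0.23/2.x Hadoop line
ant clean jar -Dhadoopversion=23
----------------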
>>>
>>>
>>>
>>> On Thu, Jun 21, 2012 at 10:02 AM, Prajakta Kalmegh <[EMAIL PROTECTED]> wrote:
>>>
>>>> Hi
>>>>
>>>> I am getting the following error while trying to execute any example
>>>> (wordcount, terasort etc):
>>>> 12/06/21 09:52:39 INFO mapreduce.Job: Running job: job_1340251923324_0001
>>>> 12/06/21 09:52:45 INFO mapreduce.Job: Job job_1340251923324_0001 running in uber mode : false
>>>> 12/06/21 09:52:45 INFO mapreduce.Job:  map 0% reduce 0%
>>>> 12/06/21 09:52:45 INFO mapreduce.Job: Job job_1340251923324_0001 failed with state FAILED due to: Application application_1340251923324_0001 failed 1 times due to AM Container for appattempt_1340251923324_0001_000001 exited with  exitCode: 1 due to:
>>>> .Failing this attempt.. Failing the application.
>>>> 12/06/21 09:52:45 INFO mapreduce.Job: Counters: 0
>>>> -----------------------------
>>>>
>>>> *The contents of the container logs are pasted below:*
>>>> 2012-06-21 09:52:43,856 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Created MRAppMaster for application appattempt_1340251923324_0001_000001
>>>> 2012-06-21 09:52:44,625 FATAL [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Error starting MRAppMaster
>>>> java.lang.Error: Unresolved compilation problems:
>>>> The method register(Class<? extends Enum>, EventHandler) in the type Dispatcher is not applicable for the arguments (Class<EventType>, EventHandler<JobHistoryEvent>)
>>>> org.apache.hadoop.mapreduce.jobhistory.EventType cannot be resolved to a type
>>>>
>>>> at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.init(MRAppMaster.java:261)
>>>> at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.run(MRAppMaster.java:1049)
>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)