MapReduce, mail # user - common error in map tasks


Re: common error in map tasks
姚吉龙 2013-04-23, 00:28
Mainly it is caused by the mapred.child.java.opts setting and the number of map tasks.
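
For reference, the two knobs this reply points at live in mapred-site.xml on Hadoop 1.x. The values below are placeholder assumptions, not recommendations; tune the child heap and slot count to the RAM actually available on each node:

```xml
<!-- Sketch for mapred-site.xml (Hadoop 1.x). Values are illustrative only. -->
<property>
  <!-- JVM options passed to each child (map/reduce) task process -->
  <name>mapred.child.java.opts</name>
  <value>-Xmx512m</value>
</property>
<property>
  <!-- Maximum map tasks run concurrently by one TaskTracker -->
  <name>mapred.tasktracker.map.tasks.maximum</name>
  <value>4</value>
</property>
```

If the per-task heap times the slot count exceeds a node's physical memory, child JVMs can die at startup, which matches the "fails immediately, no logs" symptom in this thread.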

Sent from Mailbox for iPhone

On Tue, Apr 23, 2013 at 6:15 AM, kaveh minooie <[EMAIL PROTECTED]> wrote:

> thanks Chris. I only run Nutch, so no to the external command. And I
> just checked, and it happens or has happened on all the nodes at some
> point. I have to say, though, that it doesn't cause the job to fail or
> anything; the map tasks that fail finish when they are re-spawned.
> It is just annoying, and it makes me think that some value somewhere
> in the config files is either not correct or not optimal.
> On 04/22/2013 02:49 PM, Chris Nauroth wrote:
>> I'm not aware of any Hadoop-specific meaning for exit code 126.
>>   Typically, this is a standard Unix exit code used to indicate that a
>> command couldn't be executed.  Some reasons for this might be that the
>> command is not an executable file, or the command is an executable file
>> but the user doesn't have execute permissions.  (See below for an
>> example of each of these.)
>>
>> Does your job code attempt to exec an external command?  Also, are the
>> task failures consistently happening on the same set of nodes in your
>> cluster?  If so, then I recommend checking that the command has been
>> deployed and has the correct permissions on those nodes.
>>
>> Even if your code doesn't exec an external command, various parts of the
>> Hadoop code do this internally, so you still might have a case of a
>> misconfigured node.
>>
>> Hope this helps,
>> --Chris
>>
>> [chris@Chriss-MacBook-Pro:ttys000] hadoop-common
>>  > ./BUILDING.txt
>> -bash: ./BUILDING.txt: Permission denied
>> [chris@Chriss-MacBook-Pro:ttys000] hadoop-common
>>  > echo $?
>> 126
>>
>> [chris@Chriss-MacBook-Pro:ttys000] test
>>  > ls -lrt exec
>> -rwx------  1 root  staff     0B Apr 22 14:43 exec*
>> [chris@Chriss-MacBook-Pro:ttys000] test
>>  > whoami
>> chris
>> [chris@Chriss-MacBook-Pro:ttys000] test
>>  > ./exec
>> bash: ./exec: Permission denied
>> [chris@Chriss-MacBook-Pro:ttys000] test
>>  > echo $?
>> 126
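
[Editor's note: the sessions quoted above can be reproduced on any Unix machine. This is a self-contained sketch using only temporary paths; it creates a readable but non-executable script and shows the standard 126 exit status.]

```shell
# Reproduce exit code 126: a file that exists and is readable,
# but does not have the execute bit set.
tmpdir=$(mktemp -d)
printf '#!/bin/sh\necho hello\n' > "$tmpdir/noexec.sh"
chmod 644 "$tmpdir/noexec.sh"            # read/write, no execute permission
"$tmpdir/noexec.sh" 2>/dev/null || status=$?   # shell reports Permission denied
echo "exit status: $status"              # 126, per POSIX shell conventions
rm -rf "$tmpdir"
```

Exit code 127 is the related case (command not found); 126 specifically means the command was found but could not be executed.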
>>
>>
>>
>> On Mon, Apr 22, 2013 at 2:09 PM, kaveh minooie <[EMAIL PROTECTED]> wrote:
>>
>>     thanks. that is the issue: there are no other log files. when i go to
>>     the attempt directory of that failed map task (e.g.
>>     userlogs/job_201304191712_0015/attempt_201304191712_0015_m_000019_0
>>     ) it is empty. there is no other log file. though based on the counter
>>     value, I can say that it happens right at the beginning of the map
>>     task (the counter is only 1)
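
[Editor's note: one quick way to spot this symptom across jobs is to scan the TaskTracker's userlogs tree for empty attempt directories. The sketch below builds a temporary stand-in tree; the real path, typically under hadoop.log.dir/userlogs, is an assumption you would substitute.]

```shell
# Build a stand-in userlogs tree: one attempt directory that is empty
# (child JVM died before opening stdout/stderr/syslog) and one that is not.
logdir=$(mktemp -d)
mkdir -p "$logdir/job_1/attempt_1_m_000000_0"
mkdir -p "$logdir/job_1/attempt_1_m_000001_0"
touch "$logdir/job_1/attempt_1_m_000001_0/syslog"

# An empty attempt directory usually means the task never started logging.
empty=$(find "$logdir" -mindepth 2 -maxdepth 2 -type d -empty)
echo "$empty"
rm -rf "$logdir"
```

When the attempt directory is empty, the next place to look is the TaskTracker daemon log itself, since the failure happened before the child process existed.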
>>
>>
>>
>>
>>     On 04/22/2013 02:12 AM, 姚吉龙 wrote:
>>
>>         Hi
>>
>>
>>         I have had the same problem before.
>>         I think it is caused by a memory shortage for the map task.
>>         It is just a suggestion; you can post your log.
>>
>>
>>         BRs
>>         Geelong
>>         —
>>         Sent from Mailbox for iPhone
>>
>>
>>
>>         On Mon, Apr 22, 2013 at 4:34 PM, kaveh minooie
>>         <[EMAIL PROTECTED]> wrote:
>>
>>              HI
>>
>>              regardless of what job I run, there are always a few map
>>         tasks that
>>              fail with the following, very unhelpful, message (that is
>>         the entire error message):
>>
>>              java.lang.Throwable: Child Error
>>                  at
>>         org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
>>              Caused by: java.io.IOException: Task process exit with
>>         nonzero status of 126.
>>                  at
>>         org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>>
>>
>>              I would appreciate it if someone could show me how I could
>>         figure
>>              out why this error keeps happening.
>>
>>              thanks,
>>
>>
>>
>>     --
>>     Kaveh Minooie
>>
>>
> --
> Kaveh Minooie