Re: Error reading task output
Ben Kim 2012-07-27, 17:04
Bejoy,
Thanks a lot for your response. You are right: the problem was with the
nproc configuration at the OS level.
Originally, my /etc/security/limits.conf file looked something like this:

*    hard    nofile    1000000
*    soft    nofile    1000000
*    hard    nproc     320000
*    soft    nproc     320000

but for some reason Linux had not applied the * wildcard for nproc
(nofile was applied correctly). It seems absurd, but I changed it to
the following:

*       hard    nofile    1000000
*       soft    nofile    1000000
root    soft    nproc     320000
root    hard    nproc     320000
hadoop  soft    nproc     320000
hadoop  hard    nproc     320000

and the problem is solved!
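
For anyone who hits the same wildcard oddity: a likely explanation,
assuming a RHEL/CentOS 6-style system (not confirmed in this thread),
is that the distribution ships /etc/security/limits.d/90-nproc.conf,
which is read after limits.conf and silently overrides the * entry for
nproc:

# /etc/security/limits.d/90-nproc.conf (stock contents on such systems)
*          soft    nproc     1024

Raising or removing that entry, or using explicit per-user entries as
above (which take precedence over the wildcard), works around the
override. Either way the new limit only applies to fresh login
sessions, so re-check with ulimit -u after logging in again.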

Thanks again for your help!

Ben
On Fri, Jul 27, 2012 at 9:03 PM, Bejoy Ks <[EMAIL PROTECTED]> wrote:

> Hi Ben
>
> This error happens when the MapReduce job spawns more processes than
> the underlying OS allows. You need to increase the nproc value if it
> is still at the default.
>
> You can get the current value on Linux using
> ulimit -u
> The default is 1024, I believe. Check it for the user that runs the
> MapReduce jobs; on a cluster without security enabled, that is mapred.
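> For instance, a quick check for that user (a sketch; the user name
> depends on your setup):
>
> su - mapred -c 'ulimit -u'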
>
> You need to increase this to a larger value by adding lines like the
> following to /etc/security/limits.conf:
> mapred soft nproc 10000
> mapred hard nproc 10000
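>
> Note that limits set this way are applied at login by PAM's
> pam_limits module, so a running TaskTracker keeps its old limit until
> it is restarted under the new one. One way to confirm what a live
> daemon actually has (the pid here is assumed to be the TaskTracker's):
>
> grep processes /proc/<pid>/limits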
>
> If you are running a security-enabled cluster, this value should be
> raised for the user who submits the job.
>
> Regards
> Bejoy KS
>

--

Benjamin Kim
benkimkimben at gmail