HDFS, mail # user - Too many open files error with YARN

Krishna Kishore Bonagiri 2013-03-20, 11:24

I am running a date command with YARN's distributed shell example in a
loop of 1000 times, like this:

yarn jar org.apache.hadoop.yarn.applications.distributedshell.Client \
    --jar --shell_command date --num_containers 2
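For completeness, the driver loop I am using looks roughly like the sketch below (DSHELL_JAR is a placeholder for the distributed-shell application jar that ships with the Hadoop build; the exact path varies by install):

```shell
# Hypothetical driver loop; DSHELL_JAR is a placeholder, not a real path.
DSHELL_JAR=/path/to/hadoop-yarn-applications-distributedshell.jar

for i in $(seq 1 1000); do
  yarn jar "$DSHELL_JAR" \
      org.apache.hadoop.yarn.applications.distributedshell.Client \
      --jar "$DSHELL_JAR" --shell_command date --num_containers 2
done
```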
Around the 730th iteration or so, I get an error in the NodeManager's log
saying that it failed to launch a container because there are "Too many
open files". When I observe with the lsof command, I find that one such
file is left open for each run of the Application Master, and the count
keeps growing as the loop runs.
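In case it helps, the way I am watching the open-file count grow is roughly the following (a Linux-only sketch; here the current shell's pid $$ stands in for the NodeManager pid, which you would normally look up with jps or ps):

```shell
# Hypothetical sketch: count a process's open file descriptors via /proc.
# $$ (this shell) is a stand-in for the NodeManager pid.
pid=$$
count=$(ls /proc/"$pid"/fd | wc -l)
echo "open fds for pid $pid: $count"
```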


Is this a known issue? Or am I doing something wrong? Please help.

Note: I am working on hadoop-2.0.0-alpha