HDFS >> mail # user >> Too many open files error with YARN
Too many open files error with YARN
Hi,

 I am running the date command with YARN's distributed shell example in a
loop, 1000 times, like this:

yarn jar \
  /home/kbonagir/yarn/hadoop-2.0.0-alpha/share/hadoop/mapreduce/hadoop-yarn-applications-distributedshell-2.0.0-alpha.jar \
  org.apache.hadoop.yarn.applications.distributedshell.Client \
  --jar /home/kbonagir/yarn/hadoop-2.0.0-alpha/share/hadoop/mapreduce/hadoop-yarn-applications-distributedshell-2.0.0-alpha.jar \
  --shell_command date --num_containers 2
Around the 730th iteration or so, I get an error in the NodeManager's log
saying that it failed to launch the container because there are "Too many
open files". When I check with the lsof command, I find that one connection
of this kind is left open for each run of the Application Master, and the
count keeps growing as the loop runs:

node1:44871->node1:50010
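
For anyone trying to reproduce this, here is a small sketch of how I am
watching the leak from the shell. Port 50010 matches the node1:50010
endpoint above (it is the default dfs.datanode.address port); lsof output
will of course vary per system:

```shell
# Per-process open-file limit that the "Too many open files" error
# is bounded by (commonly 1024 by default on Linux).
ulimit -n

# Count connections to the DataNode port left open by leaked
# descriptors; this number grows by one per Application Master run.
# 50010 is the default dfs.datanode.address port.
lsof -iTCP:50010 | wc -l
```

Comparing the second number against the first after each iteration shows
roughly when the launch will start failing.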

Is this a known issue? Or am I missing something? Please help.

Note: I am working on hadoop-2.0.0-alpha.

Thanks,
Kishore