Re: Too Many Open Files
Are you sure you've raised the limits for your user, and have
logged back in to the machine since?

Logged in as the user you run Eclipse as, what output do you get if
you run "ulimit -n"?
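
For reference, raising the limit usually means entries along these
lines in /etc/security/limits.conf (the username "mike" here is just a
placeholder; pick a value that suits your machine):

    mike  soft  nofile  16384
    mike  hard  nofile  16384

The new limit only applies to sessions started after you log in again,
so Eclipse has to be launched from a fresh login before "ulimit -n"
will report the raised value inside it.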

On Thu, Jul 12, 2012 at 3:03 AM, Mike S <[EMAIL PROTECTED]> wrote:
> To debug a specific file, I need to run Hadoop in Eclipse, and Eclipse
> keeps throwing the Too Many Open Files exception. I followed the posts
> out there and increased the number of open files per process in
> /etc/security/limits.conf to as high as my machine accepts, and I am
> still getting the too-many-open-files exception from Java I/O.
>
> I think the main reason is that I am using a MultipleTextOutputFormat,
> and my reducer can create many output files based on my multi-output
> logic. Is there a way to make Hadoop not keep so many files open? If
> not, can I control when the reducer closes a file?
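
As an illustration of why this happens: the old mapred API's
MultipleTextOutputFormat keeps one RecordWriter open per distinct
generated filename until the reduce task finishes, so the open-file
count tracks the number of distinct names your logic produces. A
minimal, hypothetical sketch of bounding that count by hashing keys
into a fixed set of buckets (the class name and bucket count below are
made up for illustration, not from the original thread):

    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.lib.MultipleTextOutputFormat;

    // Hypothetical example: cap open files by bucketing keys into a
    // fixed number of output files instead of one file per key.
    public class BucketedTextOutputFormat
        extends MultipleTextOutputFormat<Text, Text> {

      private static final int NUM_BUCKETS = 16;

      @Override
      protected String generateFileNameForKeyValue(Text key, Text value,
                                                   String name) {
        // At most NUM_BUCKETS writers stay open per reduce task,
        // rather than one per distinct key.
        int bucket = (key.hashCode() & Integer.MAX_VALUE) % NUM_BUCKETS;
        return "bucket-" + bucket + "/" + name;
      }
    }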

--
Harsh J