Re: Too Many Open Files
Mike,

Understood. Then you may need to use
http://wiki.apache.org/hadoop/FAQ#Can_I_write_create.2BAC8-write-to_hdfs_files_directly_from_map.2BAC8-reduce_tasks.3F
instead of MultipleTextOutputFormat.
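
A minimal sketch of that FAQ approach (old mapred API; the
SideFileReducer name and the use of the key as a file name are just
illustrative assumptions) could look roughly like this:

    import java.io.IOException;
    import java.util.Iterator;

    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reducer;
    import org.apache.hadoop.mapred.Reporter;

    // Illustrative reducer that writes each key's records to its own HDFS
    // file and closes it right away, instead of letting an OutputFormat
    // hold hundreds of thousands of writers open at once.
    public class SideFileReducer extends MapReduceBase
        implements Reducer<Text, Text, Text, Text> {

      private JobConf conf;

      @Override
      public void configure(JobConf job) {
        this.conf = job;
      }

      @Override
      public void reduce(Text key, Iterator<Text> values,
          OutputCollector<Text, Text> output, Reporter reporter)
          throws IOException {
        // Write under the task's work output path so retried or speculative
        // attempts do not clobber each other; files there are promoted to
        // the job output directory when the task commits.
        Path workDir = FileOutputFormat.getWorkOutputPath(conf);
        Path sideFile = new Path(workDir, key.toString());
        FileSystem fs = sideFile.getFileSystem(conf);
        FSDataOutputStream out = fs.create(sideFile);
        try {
          while (values.hasNext()) {
            out.write(values.next().toString().getBytes("UTF-8"));
            out.write('\n');
          }
        } finally {
          // Closing here means only one side file is ever open per task.
          out.close();
        }
      }
    }

With this pattern only one file handle is open at a time per reduce
task, so the per-process open-file limit stops being the bottleneck.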

On Thu, Jul 12, 2012 at 11:29 AM, Mike S <[EMAIL PROTECTED]> wrote:
> 100% sure I have done that, and again the problem is not that my
> configuration isn't kicking in. The problem is that my application uses
> MultipleTextOutputFormat, which may create 500,000 files, and Linux does
> not allow that many open files, for whatever reason. If I set the limit
> too high, it will ignore it.
>
> On Wed, Jul 11, 2012 at 10:12 PM, Harsh J <[EMAIL PROTECTED]> wrote:
>> Are you sure you've raised the limits for your user, and have
>> re-logged in to the machine?
>>
>> Logged in as the user you run Eclipse as, what do you get as the
>> output if you run "ulimit -n"?
>>
>> On Thu, Jul 12, 2012 at 3:03 AM, Mike S <[EMAIL PROTECTED]> wrote:
>>> To debug a specific file, I need to run Hadoop in Eclipse, and
>>> Eclipse keeps throwing a Too Many Open Files exception. I followed
>>> the posts out there on raising the number of open files per process
>>> in /etc/security/limits.conf, set it as high as my machine will
>>> accept, and I still get the too-many-open-files exception from
>>> Java IO.
>>>
>>> I think the main reason is that I am using MultipleTextOutputFormat
>>> and my reducer can create many output files based on my multi-output
>>> logic. Is there a way to make Hadoop not keep so many files open?
>>> If not, can I control when the reducer closes a file?
>>
>>
>>
>> --
>> Harsh J

--
Harsh J
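
For reference, the per-user open-file limit discussed in the quoted
thread is typically checked and raised along these lines ("youruser" is
a placeholder and the numbers are only illustrative; a fresh login is
needed before "ulimit -n" reflects changes to limits.conf):

    # Current per-process open-file limit for the logged-in user
    ulimit -n

    # Example entries in /etc/security/limits.conf
    youruser  soft  nofile  65536
    youruser  hard  nofile  65536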