HBase user mailing list: Too many open files (java.net.SocketException)


Jean-Marc Spaggiari 2013-04-09, 15:33
Ted Yu 2013-04-09, 16:00
Re: Too many open files (java.net.SocketException)
But there wasn't any trace looking like "OutOfMemoryError". Wouldn't hitting
the nproc limit have shown up as that, rather than as a SocketException?
Anyway, I have increased it to 32768. I will see if I face that again.
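
For what it's worth, a quick way to check whether the new limit is actually
picked up by the RS process, and how close it gets to it, is to read the JVM's
own file descriptor counters. A minimal sketch (my own, not HBase code; the
cast assumes a Sun/Oracle JVM on Linux, and the class name is just a
placeholder):

    import java.lang.management.ManagementFactory;
    import java.lang.management.OperatingSystemMXBean;
    import com.sun.management.UnixOperatingSystemMXBean;

    public class FdWatch {
        public static void main(String[] args) {
            OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
            if (os instanceof UnixOperatingSystemMXBean) {
                UnixOperatingSystemMXBean unix = (UnixOperatingSystemMXBean) os;
                // Sockets count against this limit too, not just regular files.
                System.out.println("open fds: " + unix.getOpenFileDescriptorCount()
                        + " / max fds: " + unix.getMaxFileDescriptorCount());
            } else {
                System.out.println("FD counters not exposed on this platform");
            }
        }
    }

The same two counters are exposed on the java.lang:type=OperatingSystem MBean,
so they can also be read from the running RegionServer over JMX without
restarting anything.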

Thanks,

JM

2013/4/9 Ted Yu <[EMAIL PROTECTED]>:
> According to http://hbase.apache.org/book.html#ulimit , you should increase
> the nproc setting.
>
> Cheers
>
> On Tue, Apr 9, 2013 at 8:33 AM, Jean-Marc Spaggiari <[EMAIL PROTECTED]
>> wrote:
>
>> Hi,
>>
>> I just faced an issue this morning on one of my RS.
>>
>> Here is an extract of the logs
>> 2013-04-09 11:05:33,164 ERROR org.apache.hadoop.hdfs.DFSClient: Exception closing file /hbase/entry_proposed/ae4a5d72d4613728ddbcc5a64262371b/.tmp/ed6a0154ef714cd88faf26061cf248d3 : java.net.SocketException: Too many open files
>> java.net.SocketException: Too many open files
>>         at sun.nio.ch.Net.socket0(Native Method)
>>         at sun.nio.ch.Net.socket(Net.java:323)
>>         at sun.nio.ch.Net.socket(Net.java:316)
>>         at sun.nio.ch.SocketChannelImpl.<init>(SocketChannelImpl.java:101)
>>         at sun.nio.ch.SelectorProviderImpl.openSocketChannel(SelectorProviderImpl.java:60)
>>         at java.nio.channels.SocketChannel.open(SocketChannel.java:142)
>>         at org.apache.hadoop.net.StandardSocketFactory.createSocket(StandardSocketFactory.java:58)
>>         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.createBlockOutputStream(DFSClient.java:3423)
>>         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3381)
>>         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2600(DFSClient.java:2589)
>>         at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2829)
>>
>> ulimit is unlimited on all my servers.
>>
>> It seems there were too many network connections open. Is there
>> anything HBase can do to handle such a scenario? It's only Hadoop in the
>> stack trace, so I'm not sure.
>>
>> Can this be related to nproc? I don't think so. I have another tool
>> running on the RS that uses low CPU and low bandwidth but makes MANY
>> HTTP network connections...
>>
>> Any suggestion?
>>
>> JM
>>
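
On the "MANY network HTTP connections" point in the quoted message: every open
socket counts against the per-process nofile limit just like a regular file
(and against the system-wide fs.file-max), so a client that never drains and
closes its streams leaks descriptors steadily. A minimal sketch of the
close-it-eagerly pattern, using plain java.net.HttpURLConnection and a
placeholder URL; nothing below comes from the tool mentioned above, it is just
the generic pattern:

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class HttpFetch {
        public static void main(String[] args) throws IOException {
            URL url = new URL("http://example.com/"); // placeholder endpoint
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            InputStream in = conn.getInputStream();
            try {
                byte[] buf = new byte[8192];
                // Drain the body so the connection is eligible for keep-alive
                // reuse rather than lingering as a half-read socket.
                while (in.read(buf) != -1) {
                    // discard
                }
            } finally {
                in.close();        // release the stream
                conn.disconnect(); // close the underlying socket (and its descriptor) now
            }
        }
    }

Whether closing eagerly or pooling connections is the better trade-off depends
on the tool, but leaked sockets are what eventually surface as "Too many open
files", and lsof -p <pid> (or the FD counters sketched earlier in the thread)
makes it easy to see which process is accumulating them.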
Andrew Purtell 2013-04-09, 16:36
Ted 2013-04-10, 00:53