hdfs unable to create new block with 'Too many open files' exception
sam liu 2013-12-21, 16:30
We failed to run an MR job that accesses Hive, because HDFS was unable to
create a new block during the reduce phase. The exceptions:
1) In tasklog:
hdfs.DFSClient: DataStreamer Exception: java.io.IOException: Unable to
create new block
2) In HDFS data node log:
DataXceiveServer: IOException due to:java.io.IOException: Too many open files
In hdfs-site.xml, we set 'dfs.datanode.max.xcievers' to 8196. We also
modified /etc/security/limits.conf to raise the nofile limit for the mapred
user to 1048576. But the issue still happens.
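For reference, here is a rough sketch of how to check whether the raised
limit is actually in effect. Two assumptions worth verifying: limits.conf
changes only apply to sessions started after the edit (a running daemon keeps
its old limit until restarted), and the DataNode typically runs as a
different user (often hdfs) than mapred, so its limit may be unchanged:

```shell
# Limit in effect for the current shell/user:
ulimit -n

# Limit of the already-running DataNode process, read from /proc
# (the pgrep pattern is an assumption; adjust to your process name):
# DN_PID=$(pgrep -f org.apache.hadoop.hdfs.server.datanode.DataNode)
# grep 'open files' /proc/$DN_PID/limits
```

If /proc/$DN_PID/limits still shows the old value, restarting the DataNode
after adding a limits.conf entry for its own user should pick up the change.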
Thanks a lot!