Best way to tune the Hadoop heap size parameters
Hi Guys,

We have a problem with our production Hadoop cluster: most of the time we see
Java heap size issues, and one of the Hadoop components fails with an
OutOfMemoryError.

2013-03-08 08:01:10,749 WARN org.apache.hadoop.ipc.Server: IPC Server
handler 57 on 8020, call
org.apache.hadoop.hdfs.server.protocol.DatanodeProtocol.blockReport from error: java.lang.OutOfMemoryError: Java heap space
java.lang.OutOfMemoryError: Java heap space
        at java.util.Arrays.copyOf(Arrays.java:2760)
        at java.util.Arrays.copyOf(Arrays.java:2734)
Is there any mechanism to fine-tune the NameNode and JobTracker heap sizes?
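
For what it's worth, on Hadoop 1.x the daemon heaps are normally set in `conf/hadoop-env.sh` via `HADOOP_HEAPSIZE` and the per-daemon `*_OPTS` variables. A minimal sketch follows; the 4096m/2048m values are illustrative placeholders, not recommendations for your cluster:

```shell
# conf/hadoop-env.sh -- per-daemon JVM options (Hadoop 1.x layout).
# The sizes below are example values only; tune them to your namespace.

# Default heap (in MB) for daemons that don't override it.
export HADOOP_HEAPSIZE=2048

# Give the NameNode a larger heap of its own.
export HADOOP_NAMENODE_OPTS="-Xmx4096m ${HADOOP_NAMENODE_OPTS}"

# JobTracker heap.
export HADOOP_JOBTRACKER_OPTS="-Xmx2048m ${HADOOP_JOBTRACKER_OPTS}"
```

After restarting the daemons you can confirm the setting took effect by checking the `-Xmx` flag on the running process command line (e.g. with `jps` and `ps`).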

In our current scenario we have 181 TB of DFS capacity, and we are
continuously adding data to the cluster.

Is there any calculation [like a formula] that relates data size to the
NameNode heap size and DataNode heap size?
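
There is no exact formula, but a commonly cited rule of thumb is that each NameNode namespace object (file, directory, or block) costs on the order of 150 bytes of heap, i.e. roughly 1 GB per million objects. The sketch below encodes that heuristic; the 150-byte cost, the 2x headroom factor, and the example block size are all assumptions, not measured values from your cluster:

```python
# Rough NameNode heap estimate from object counts -- a rule-of-thumb
# sketch, not an exact formula. Assumptions: ~150 bytes of heap per
# namespace object (file, directory, or block) and a 2x safety margin.

def estimate_namenode_heap_gb(num_files, num_dirs, num_blocks,
                              bytes_per_object=150, headroom=2.0):
    """Return a padded NameNode heap estimate in GB."""
    objects = num_files + num_dirs + num_blocks
    return objects * bytes_per_object * headroom / 1e9

# Illustrative example: 181 TB of data, 128 MB blocks, and (for
# simplicity) roughly one block per file.
data_bytes = 181 * 10**12
block_size = 128 * 2**20
blocks = data_bytes // block_size
est_gb = estimate_namenode_heap_gb(num_files=blocks,
                                   num_dirs=10_000,
                                   num_blocks=blocks)
print(f"~{est_gb:.1f} GB NameNode heap")
```

Note the estimate is driven by object *count*, not raw data size: many small files inflate the NameNode heap far more than the same bytes stored in large files. DataNode heap is much less sensitive to data volume and usually stays at the default unless block counts per node are very high.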

Please guide me.


Did I learn something today? If not, I wasted it.