MapReduce >> mail # user >> JAVA heap error for the tasks in mapreduce


Abhishek Shivkumar 2013-03-20, 06:40
Re: JAVA heap error for the tasks in mapreduce
Hi,

You have to increase the upper limit; also check the mapred.child.ulimit
property.
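
For reference, a minimal mapred-site.xml sketch using the old MR1 property names (the values are illustrative; mapred.child.ulimit is in kilobytes and should sit comfortably above the child heap plus JVM overhead):

```xml
<!-- Illustrative mapred-site.xml fragment (MR1 property names). -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx4096m</value>
</property>
<property>
  <name>mapred.child.ulimit</name>
  <!-- Kilobytes: ~6 GB, above the 4 GB heap plus JVM overhead. -->
  <value>6291456</value>
</property>
```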

Cheers!
Manoj.
On Wed, Mar 20, 2013 at 12:10 PM, Abhishek Shivkumar <
[EMAIL PROTECTED]> wrote:

> Hi,
>
>     I have a setup() method in the Mapper.java class where I am reading in
> a 1.6 GB HashMap that was serialized into a file and stored in HDFS. When I
> am running the job, it gets stuck at the readobject() method that reads
> this serialized file into a HashMap.
>
> I increased the heap size both by doing export HADOOP_HEAPSIZE=4096 and
> also writing conf.set("mapred.map.child.opts", "-Xmx4096M"); and
> conf.set("mapred.reduce.child.opts", "-Xmx4096M");
>
> It still doesn't help. Should we do something else? If I set
> HADOOP_HEAPSIZE beyond this, the hadoop command fails to run and cannot
> instantiate a JVM.
>
> Any comments would be appreciated!
>
> Thank you!
>
> With Regards,
> Abhishek S
>
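For context, a self-contained sketch of the readObject() pattern the message describes (the class name MapRoundTrip and the tiny test map are illustrative; the original is a ~1.6 GB HashMap read from HDFS). Deserialization materializes the entire map at once, which is why the child JVM's -Xmx must exceed the map's in-memory footprint:

```java
// Minimal sketch of the serialize/deserialize round-trip from the thread.
import java.io.*;
import java.util.HashMap;
import java.util.Map;

public class MapRoundTrip {

    // Serialize a map to bytes and read it back with readObject(), as the
    // Mapper's setup() in the thread does against an HDFS stream.
    @SuppressWarnings("unchecked")
    static Map<String, Integer> roundTrip(Map<String, Integer> original)
            throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(original);
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            // readObject builds the whole map in memory; with a 1.6 GB map
            // the child heap must cover the map plus deserialization overhead.
            return (Map<String, Integer>) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Map<String, Integer> original = new HashMap<>();
        original.put("alpha", 1);
        original.put("beta", 2);
        System.out.println(roundTrip(original).equals(original)); // prints true
    }
}
```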