Abhishek Shivkumar 2013-03-20, 06:40
Re: Java heap error for the tasks in mapreduce
Manoj Babu 2013-03-20, 06:48
You have to increase the upper limit as well; also check mapred.child.ulimit.
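As a sketch (assuming Hadoop 1.x property names; adjust values to your nodes' physical memory), the per-task JVM heap and the task virtual-memory limit can be set together in mapred-site.xml. Note that mapred.child.ulimit is specified in kilobytes and must comfortably exceed the -Xmx heap, and that HADOOP_HEAPSIZE only sizes the client/daemon JVMs, not the task JVMs:

```xml
<!-- mapred-site.xml fragment (sketch, Hadoop 1.x names; not a drop-in config) -->
<property>
  <name>mapred.child.java.opts</name>
  <!-- heap for each spawned map/reduce task JVM -->
  <value>-Xmx4096m</value>
</property>
<property>
  <name>mapred.child.ulimit</name>
  <!-- virtual-memory limit per task, in kilobytes; 8388608 KB = 8 GB,
       leaving headroom above the 4 GB heap for JVM overhead -->
  <value>8388608</value>
</property>
```

If the ulimit is left at or below the heap size, the task JVM can be killed (or fail to start) before the heap limit is ever reached, which looks like a hang or an immediate task failure rather than an OutOfMemoryError.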
On Wed, Mar 20, 2013 at 12:10 PM, Abhishek Shivkumar <
[EMAIL PROTECTED]> wrote:
> I have a setup() method in the Mapper.java class where I am reading in
> a 1.6 GB HashMap that was serialized into a file and stored in HDFS. When I
> am running the job, it gets stuck at the readObject() method that reads
> this serialized file into a HashMap.
> I increased the heap size both by doing export HADOOP_HEAPSIZE=4096 and
> also writing *conf.set("mapred.map.child.opts", "-Xmx4096M");* and *conf.set("mapred.reduce.child.opts", "-Xmx4096M");*
> It still doesn't help. Should we do something else? If I set
> HADOOP_HEAPSIZE any higher than this, the hadoop command fails to run
> because it cannot instantiate a JVM.
> Any comments would be appreciated!
> Thank you!
> With Regards,
> Abhishek S