MapReduce, mail # user - Out of memory (heap space) errors on job tracker

David Rosenstrauch 2012-06-08, 15:26
Re: Out of memory (heap space) errors on job tracker
Arun C Murthy 2012-06-08, 18:59
This shouldn't be happening at all...

What version of Hadoop are you running? You may be missing the configs that protect the JT; with those in place, a hadoop-1.x JobTracker should be very reliable.
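[Editor's note: Arun doesn't name the configs here. A plausible reading is the hadoop-1.x JobTracker-protection properties in mapred-site.xml that bound how much job state the JT keeps in memory; the values below are illustrative, not recommendations:]

```xml
<!-- Sketch of JT-protection settings for hadoop-1.x (mapred-site.xml).
     Property names are real hadoop-1.x keys; values are examples only. -->
<property>
  <!-- Cap completed jobs retained in JT memory per user (default 100) -->
  <name>mapred.jobtracker.completeuserjobs.maximum</name>
  <value>25</value>
</property>
<property>
  <!-- Reject jobs whose split meta-info exceeds this many bytes,
       so a single pathological job can't blow up the JT heap -->
  <name>mapreduce.jobtracker.split.metainfo.maxsize</name>
  <value>10000000</value>
</property>
<property>
  <!-- Cap the number of tasks a single job may create -->
  <name>mapred.jobtracker.maxtasks.per.job</name>
  <value>100000</value>
</property>
```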


On Jun 8, 2012, at 8:26 AM, David Rosenstrauch wrote:

> Our job tracker has been seizing up with Out of Memory (heap space) errors for the past 2 nights.  After the first night's crash, I doubled the heap space (from the default of 1GB) to 2GB before restarting the job.  After last night's crash I doubled it again to 4GB.
> This all seems a bit puzzling to me.  I wouldn't have thought that the job tracker should require so much memory.  (The NameNode, yes, but not the job tracker.)
> Just wondering if this behavior sounds reasonable, or if perhaps there might be a bigger problem at play here.  Anyone have any thoughts on the matter?
> Thanks,
> DR
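[Editor's note: the heap doubling David describes is typically done in conf/hadoop-env.sh. A minimal sketch for hadoop-1.x, assuming the stock hadoop-env.sh variable names; raising the heap treats the symptom, while the protection configs above address the cause:]

```shell
# conf/hadoop-env.sh (hadoop-1.x)
# Give only the JobTracker daemon a 4 GB heap, rather than raising
# HADOOP_HEAPSIZE, which applies to every Hadoop daemon on the node.
export HADOOP_JOBTRACKER_OPTS="-Xmx4g $HADOOP_JOBTRACKER_OPTS"
```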

Arun C. Murthy
Hortonworks Inc.
Other replies in this thread:
- David Rosenstrauch 2012-06-08, 19:07
- Harsh J 2012-06-10, 05:40
- Arun C Murthy 2012-06-11, 00:39
- David Rosenstrauch 2012-06-12, 21:22
- David Rosenstrauch 2012-06-12, 21:20