Pig user mailing list: Java heap error


Re: Java heap error
Syed,

One-line stack traces aren't very helpful :) Please provide the full stack
trace and the Pig script that produced it, and we can take a look.

Ashutosh
On Wed, Jul 7, 2010 at 14:09, Syed Wasti <[EMAIL PROTECTED]> wrote:

>
> I am running my Pig scripts on our QA cluster (with 4 datanodes, see below),
> which has the Cloudera CDH2 release installed; the global heap max is -Xmx4096m.
> I am constantly getting OutOfMemory errors (see below) on my map and reduce
> jobs when I try to run my script against large data, where it produces around
> 600 maps.
> I am looking for some tips on the best configuration for Pig to get rid of
> these errors. Thanks.
>
>
>
> Error: GC overhead limit exceeded
> Error: java.lang.OutOfMemoryError: Java heap space
>
> Regards
> Syed
>
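On a CDH2-era (Hadoop 0.20) cluster the heap given to each map and reduce task JVM is controlled by mapred.child.java.opts, and a Pig script can pass job properties through with its set command. What follows is only a minimal sketch under that assumption; the relation names, paths, and values are illustrative, not a confirmed fix for this particular job:

  -- Sketch: per-job tuning at the top of a Pig script (values illustrative).
  -- Give each map/reduce task JVM a larger heap:
  set mapred.child.java.opts '-Xmx1024m';
  -- Explicitly choose the number of reducers for the job:
  set default_parallel 10;

  -- Hypothetical pipeline, included only to make the sketch self-contained:
  data    = LOAD 'input' USING PigStorage('\t') AS (id:chararray, value:long);
  grouped = GROUP data BY id;
  counts  = FOREACH grouped GENERATE group, COUNT(data) AS cnt;
  STORE counts INTO 'output';

The same property can also be raised cluster-wide in mapred-site.xml, but setting it per script keeps the change scoped to the job that actually needs the extra heap.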