Pig >> mail # user >> Pig OutOfMemory Error on org.apache.hadoop.mapreduce.Reducer$Context


Re: Pig OutOfMemory Error on org.apache.hadoop.mapreduce.Reducer$Context
I've found the reason: org.apache.hadoop.io.DataInputBuffer. There's a big byte array referenced by the DataInputBuffer.

But there's no way to close the buffer.

Is  there any other solution?

BTW, my script is more than 500 lines, so I'm sorry I can't show it to you.
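For what it's worth, DataInputBuffer has no close(), but it does have reset(byte[], int), which replaces the array the buffer holds. One possible workaround (a sketch only; SimpleInputBuffer below is a hypothetical stand-in for illustration, not the actual Hadoop class) is to reset the buffer with an empty array once you are done with it, so the big array is no longer referenced and can be garbage-collected:

```java
// A minimal stand-in mimicking the relevant part of
// org.apache.hadoop.io.DataInputBuffer's API: reset(byte[], int)
// keeps a reference to the array it is given, which is why a large
// record stays on the heap as long as the buffer itself is reachable.
class SimpleInputBuffer {
    private byte[] data = new byte[0];
    private int length = 0;

    // Mirrors DataInputBuffer.reset(byte[] input, int length):
    // the buffer pins `input` until the next reset.
    void reset(byte[] input, int len) {
        this.data = input;
        this.length = len;
    }

    byte[] getData() { return data; }
    int getLength() { return length; }
}

public class BufferDemo {
    public static void main(String[] args) {
        SimpleInputBuffer buf = new SimpleInputBuffer();

        byte[] big = new byte[64 * 1024 * 1024]; // pretend this is a 64 MB record
        buf.reset(big, big.length);
        System.out.println("pinned bytes: " + buf.getLength());

        // Workaround: "close" the buffer by resetting it with an empty
        // array; once `big` itself goes out of scope, the 64 MB array
        // becomes unreachable and the GC can reclaim it.
        buf.reset(new byte[0], 0);
        big = null;
        System.out.println("pinned bytes after reset: " + buf.getLength());
    }
}
```

Whether you can reach the actual buffer instance from Pig user code depends on where it is held inside the reducer, so this may only be practical inside a custom UDF or loader.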

Here's the screenshot:


Haitao Yao
[EMAIL PROTECTED]
weibo: @haitao_yao
Skype:  haitao.yao.final

On 2012-8-20, at 9:49 AM, Haitao Yao wrote:

> Hi, all,
> I got an OOME on org.apache.hadoop.mapreduce.Reducer$Context. Here's a snapshot of the heap dump:
> <aa.jpg>
>
> Well, does Pig have to report so much data through the Reducer$Context?
> Can this be closed?
>
> thanks.
>
>
>
> Haitao Yao
> [EMAIL PROTECTED]
> weibo: @haitao_yao
> Skype:  haitao.yao.final
>