Re: Pig OutOfMemory Error on org.apache.hadoop.mapreduce.Reducer$Context
I've found the cause: org.apache.hadoop.io.DataInputBuffer. There is a big byte array referenced by the DataInputBuffer, but there is no way to close the buffer.
Is there any other solution?
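One idea (a sketch only, not a verified fix): DataInputBuffer exposes reset(byte[] input, int length) rather than close(), so resetting it with an empty array should drop its strong reference to the large backing array and let the GC reclaim it. The stand-in class below only mimics that API shape to illustrate the pattern; it is not the real Hadoop class, and whether Pig's internals let you reach the buffer at the right time is an open question.

```java
import java.lang.ref.WeakReference;

public class BufferResetSketch {
    // Minimal stand-in mimicking org.apache.hadoop.io.DataInputBuffer's
    // reset(byte[], int) API shape; the real class has no close() method.
    static class FakeDataInputBuffer {
        private byte[] data = new byte[0];
        void reset(byte[] input, int length) { this.data = input; }
        byte[] getData() { return data; }
    }

    public static void main(String[] args) {
        FakeDataInputBuffer buf = new FakeDataInputBuffer();
        byte[] big = new byte[64 * 1024 * 1024];   // large backing array
        buf.reset(big, big.length);

        WeakReference<byte[]> ref = new WeakReference<>(big);
        big = null;  // drop our own strong reference; buf still pins the array

        // Workaround: reset with an empty array so the buffer no longer
        // holds the large array, making it eligible for collection.
        buf.reset(new byte[0], 0);
        System.gc();  // only a hint; collection is not guaranteed
        System.out.println("big array collected: " + (ref.get() == null));
    }
}
```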
BTW, my script is more than 500 lines, so I'm sorry I cannot share it.
Here's the screenshot (attachment not included):
On 2012-8-20, at 9:49 AM, Haitao Yao wrote:
> Hi, all,
> I got an OOME on org.apache.hadoop.mapreduce.Reducer$Context; here's a snapshot of the heap dump:
> Well, does Pig have to report so much data through the Reducer$Context?
> Can this be closed?
> Haitao Yao
> [EMAIL PROTECTED]
> weibo: @haitao_yao
> Skype: haitao.yao.final