Re: Pig OutOfMemory Error on org.apache.hadoop.mapreduce.Reducer$Context
Thank you very much for this.
But I still cannot find which part of the script caused this OOM. The heap dump is generated at midnight, when nobody is standing by.

I've added some scripts to capture more information about the job. If they succeed, I'll share the results with you tomorrow.
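
(A minimal sketch of one way to capture such information automatically, assuming the Java PigServer API and Hadoop 1.x-era property names; the heap size, dump path, and script name are placeholders: each map/reduce child JVM is asked to write a heap dump when it hits an OOM, so there is something to analyze even when the failure happens overnight.)

    import java.util.Properties;
    import org.apache.pig.ExecType;
    import org.apache.pig.PigServer;

    public class OomDumpLauncher {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            // Ask each map/reduce child JVM to dump its heap when it dies of OOM.
            // Heap size and dump path are placeholders; adjust for the cluster.
            props.setProperty("mapred.child.java.opts",
                    "-Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/dumps");
            PigServer pig = new PigServer(ExecType.MAPREDUCE, props);
            pig.registerScript("myscript.pig");  // placeholder for the actual script
        }
    }
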
Thanks again.

Haitao Yao
[EMAIL PROTECTED]
weibo: @haitao_yao
Skype:  haitao.yao.final

On 2012-8-21, at 5:22 PM, Subir S wrote:

> I think it would help to get some context if you can share a small snippet from your script that causes this issue. That may give a better understanding of what you are trying to achieve, and whether there is a workaround. However, this is up to you!
>
> Thanks, Subir
>
> On Tue, Aug 21, 2012 at 2:12 PM, Haitao Yao <[EMAIL PROTECTED]> wrote:
> I've found the reason: there's a big byte array referenced by org.apache.hadoop.io.DataInputBuffer.
>
> But there's no way to close the buffer.
>
> Is there any other solution?
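>
> (A minimal sketch of the retention, assuming the stock org.apache.hadoop.io.DataInputBuffer API: reset() keeps a reference to the caller's byte array, and the class has no close()/release method, so the array stays strongly reachable for as long as the buffer itself does.)
>
>     import org.apache.hadoop.io.DataInputBuffer;
>
>     public class BufferRetention {
>         public static void main(String[] args) {
>             byte[] big = new byte[64 * 1024 * 1024];   // e.g. one large serialized record
>             DataInputBuffer in = new DataInputBuffer();
>             in.reset(big, big.length);  // the buffer now pins 'big'
>             big = null;                 // our reference is gone...
>             // ...but 'in' still holds the 64 MB array. The only way to release
>             // it is to drop the buffer or re-point it at something small:
>             in.reset(new byte[0], 0);
>         }
>     }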
>
> BTW, my script is more than 500 lines; sorry, I cannot share it.
>
> Here are the screenshots:
>
> <aa.jpg>
>
> <bb.jpg>
>
> Haitao Yao
> [EMAIL PROTECTED]
> weibo: @haitao_yao
> Skype:  haitao.yao.final
>
> On 2012-8-20, at 9:49 AM, Haitao Yao wrote:
>
>> Hi, all,
>> I got an OOME on org.apache.hadoop.mapreduce.Reducer$Context. Here's a snapshot of the heap dump:
>> <aa.jpg>
>>
>> Well, does Pig have to report so much data through the Reducer$Context?
>> Can this be closed?
>>
>> thanks.
>>
>>
>>
>> Haitao Yao
>> [EMAIL PROTECTED]
>> weibo: @haitao_yao
>> Skype:  haitao.yao.final
>>
>
>