Re: Locked Job
Michael Segel 2012-08-27, 13:19
If you block inside the Mapper.map() method, the task attempt will almost certainly time out (after 10 minutes by default, because it stops reporting progress) and the attempt is killed. Once enough attempts fail, your job fails.
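For reference, that 10-minute limit is a configurable property. In the Hadoop 1.x releases current at the time of this thread it is `mapred.task.timeout`, in milliseconds; a sketch of overriding it in mapred-site.xml (the value shown is the default, not a recommendation):

```
<property>
  <name>mapred.task.timeout</name>
  <!-- milliseconds; 600000 = 10 minutes (the default) -->
  <value>600000</value>
</property>
```

Setting it to 0 disables the timeout entirely, which is generally worse than the heartbeat approach described below because a genuinely hung task would then never be killed.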
If, in the Mapper.setup() method, you create a heartbeat thread that wakes up every minute and updates the task's status with a changing value (the current time in milliseconds, for example), you can keep the task alive.
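A minimal sketch of that heartbeat pattern. In a real mapper the status sink would be the Hadoop task context (something like context.setStatus(...) from setup()); here a plain AtomicReference stands in for it so the example is self-contained, and all names are illustrative rather than Hadoop API:

```java
import java.util.concurrent.atomic.AtomicReference;

public class Heartbeat {
    private final AtomicReference<String> statusSink; // stand-in for the task's status
    private final long intervalMs;                    // 60_000 in the scenario above
    private final Thread thread;
    private volatile boolean running = true;

    public Heartbeat(AtomicReference<String> statusSink, long intervalMs) {
        this.statusSink = statusSink;
        this.intervalMs = intervalMs;
        this.thread = new Thread(() -> {
            while (running) {
                // A changing value (current time in ms) signals the task is alive.
                statusSink.set("alive at " + System.currentTimeMillis());
                try {
                    Thread.sleep(this.intervalMs);
                } catch (InterruptedException e) {
                    return; // stop() interrupts us; exit cleanly
                }
            }
        });
        // Daemon thread: it dies with the JVM, so it can't outlive the task.
        this.thread.setDaemon(true);
    }

    public void start() {
        thread.start();
    }

    public void stop() {
        running = false;
        thread.interrupt();
    }
}
```

You would start this from setup() and stop it from cleanup(); the point is only that *something* keeps touching the status while map() is blocked on the external service.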
Having said that... you really don't want to do what you're thinking.
A never-ending mapper ties up a slot for as long as it runs, and that blocks another job's task from taking its place.
On Aug 27, 2012, at 8:13 AM, Juan P. <[EMAIL PROTECTED]> wrote:
> Hi guys!
> I need some clarification on the expected behavior for a hadoop MapReduce job.
> Say I was to create a Mapper task which never ends. It reads the first line of input and then reads data from an external service eternally. If the service is empty it will lock until data is available.
> Will the jobtracker continue to receive the Heartbeat?
> Will the jobtracker kill the task at some point?
> I know that it's not the way Hadoop was intended to be used, I just need to clarify this specific scenario.
> Thank you!