MapReduce user mailing list: parameter to control number of retries


Re: parameter to control number of retries
From: Harsh J (2012-11-20)
Sure: set mapred.map.max.attempts or mapred.reduce.max.attempts in
your job configuration to control the maximum number of attempts
allowed for each map or reduce task, respectively. Both default to 4;
a task that fails that many times causes the whole job to fail.
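A minimal sketch, assuming the old-style mapred (MR1) API; the driver
class name and the value 6 are just for illustration:

import org.apache.hadoop.mapred.JobConf;

public class RetryConfigExample {
    public static void main(String[] args) {
        // JobConf is the MR1 job configuration object.
        JobConf conf = new JobConf(RetryConfigExample.class);

        // Allow each map task up to 6 attempts (default is 4).
        conf.setInt("mapred.map.max.attempts", 6);

        // Same limit for reduce tasks (default is also 4).
        conf.setInt("mapred.reduce.max.attempts", 6);

        // ... set mapper, reducer, input/output paths, then submit,
        // e.g. JobClient.runJob(conf);
    }
}

On newer MR2/YARN releases the same limits are exposed as
mapreduce.map.maxattempts and mapreduce.reduce.maxattempts.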

On Tue, Nov 20, 2012 at 8:44 PM, Serge Blazhiyevskyy
<[EMAIL PROTECTED]> wrote:
> Hi,
>
> I am looking for a parameter to control the number of retries for a failed task.
>
> Can anybody point me in the right direction?
>
> Thanks
> Serge

--
Harsh J