HDFS >> mail # user >> Re: parameter to control number of retries


Re: parameter to control number of retries
Thanks a lot!

On 11/20/12 5:23 PM, "Harsh J" <[EMAIL PROTECTED]> wrote:

>Sure, tweak mapred.map.max.attempts or mapred.reduce.max.attempts in
>your job to set the maximum number of failed task retries for map and
>reduce tasks of a job respectively. Both default to 4.
>
>On Tue, Nov 20, 2012 at 8:44 PM, Serge Blazhiyevskyy
><[EMAIL PROTECTED]> wrote:
>> Hi,
>>
>> I am looking for a parameter to control number of retries for failed
>>task.
>>
>> Can anybody point me to the right direction?
>>
>> Thanks
>> Serge
>
>
>
>--
>Harsh J
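For reference, the two properties Harsh names above can be set on the job's configuration before submission. A minimal sketch using the classic `mapred` API (the `8` is an arbitrary example value, and this assumes Hadoop's jars are on the classpath):

```java
import org.apache.hadoop.mapred.JobConf;

public class RetryConfigSketch {
    public static void main(String[] args) {
        JobConf conf = new JobConf();

        // Maximum attempts per failed map/reduce task; both default to 4.
        conf.setInt("mapred.map.max.attempts", 8);
        conf.setInt("mapred.reduce.max.attempts", 8);

        // JobConf also exposes typed setters for the same properties:
        // conf.setMaxMapAttempts(8);
        // conf.setMaxReduceAttempts(8);
    }
}
```

Equivalently, they can be passed on the command line via `-D mapred.map.max.attempts=8 -D mapred.reduce.max.attempts=8` when the driver uses `GenericOptionsParser` (e.g. extends `Configured` and implements `Tool`).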