HDFS user mailing list: set the number of reduce tasks in the wordcount by command line


Re: set the number of reduce tasks in the wordcount by command line
Yeah, thanks Krishna for pointing out the YARN-specific property name.

Regards,
Shahab
On Wed, Sep 25, 2013 at 6:54 PM, Krishna Pisupat
<[EMAIL PROTECTED]> wrote:

> You can invoke setNumReduceTasks() on the Job object that you use to run
> the MR job (a driver sketch follows after the quoted thread).
>
>
> http://hadoop.apache.org/docs/r2.0.6-alpha/api/org/apache/hadoop/mapreduce/Job.html#setNumReduceTasks(int)
>
> Or else you can set the property mapreduce.job.reduces in mapred-site.xml
> (a sample entry follows after the quoted thread):
>
> mapreduce.job.reduces (default: 1): The default number of reduce tasks
> per job. Typically set to 99% of the cluster's reduce capacity, so that
> if a node fails the reduces can still be executed in a single wave.
> Ignored when mapreduce.jobtracker.address is "local".
>
>
>
>
> On Sep 25, 2013, at 3:17 PM, xeon <[EMAIL PROTECTED]> wrote:
>
> In YARN 2.0.5, where do I set this?
>
> On 09/25/2013 11:16 PM, Shahab Yunus wrote:
>
> Have you tried setting the *mapred.reduce.tasks* property? (A command-line
> example follows after the quoted thread.)
>
>  Regards,
> Shahab
>
>
> On Wed, Sep 25, 2013 at 6:01 PM, xeon <[EMAIL PROTECTED]> wrote:
>
>> Is it possible to set the number of reduce tasks for the wordcount example
>> when I launch the job from the command line?
>>
>> Thanks
>>
>
>
>
>
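For reference, here is a minimal, self-contained sketch of a WordCount-style driver that wires in both approaches discussed above. It is not the stock example that ships with Hadoop; the class name WordCountWithReducers and the optional third argument for the reducer count are illustrative. Because the driver runs through ToolRunner, any -D properties on the command line (such as -D mapreduce.job.reduces=4) are parsed as generic options and placed in the job configuration.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class WordCountWithReducers extends Configured implements Tool {

  // Standard word-count mapper: emits (word, 1) for each token.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Standard word-count reducer: sums the counts per word.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  @Override
  public int run(String[] args) throws Exception {
    // getConf() already carries any -D properties parsed by ToolRunner,
    // e.g. -D mapreduce.job.reduces=4 given on the command line.
    Job job = Job.getInstance(getConf(), "word count");
    job.setJarByClass(WordCountWithReducers.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);

    // Programmatic alternative (Krishna's first suggestion): an optional third
    // argument, if present, overrides mapreduce.job.reduces for this job only.
    if (args.length > 2) {
      job.setNumReduceTasks(Integer.parseInt(args[2]));
    }

    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    return job.waitForCompletion(true) ? 0 : 1;
  }

  public static void main(String[] args) throws Exception {
    System.exit(ToolRunner.run(new Configuration(), new WordCountWithReducers(), args));
  }
}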
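To the original command-line question: the wordcount example bundled with Hadoop 2.x runs its arguments through GenericOptionsParser, so it should accept the reducer count as a -D generic option at launch time. The jar name below is illustrative and varies by release, and generic options must come before the positional input and output paths:

hadoop jar hadoop-mapreduce-examples-*.jar wordcount -D mapreduce.job.reduces=4 /input /output

On classic (pre-YARN) clusters the equivalent property is mapred.reduce.tasks, the name Shahab mentioned; on YARN/MRv2 it is mapreduce.job.reduces.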
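For a cluster-wide default rather than a per-job setting, the same property can go in mapred-site.xml; a minimal entry (the value 4 is just an example) would look like:

<property>
  <name>mapreduce.job.reduces</name>
  <value>4</value>
</property>

A per-job -D or a programmatic setNumReduceTasks() call overrides this default, unless the property is marked final in the site file.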