Hi, I have done some tests in my Pseudo Mode (CDH4.1.2) with MRv2 (YARN). According to the docs:

*mapreduce.jobtracker.address*: The host and port that the MapReduce job tracker runs at. If "local", then jobs are run in-process as a single map and reduce task.

*mapreduce.job.maps* (default value is 2): The default number of map tasks per job. Ignored when mapreduce.jobtracker.address is "local".
I changed mapreduce.jobtracker.address to Hadoop:50031.
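For reference, that change in mapred-site.xml looks roughly like this (a minimal sketch; "Hadoop" is assumed to be the machine's hostname):

    <!-- mapred-site.xml: property changed from the default "local" -->
    <property>
      <name>mapreduce.jobtracker.address</name>
      <value>Hadoop:50031</value>
    </property>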
Then I ran the wordcount example:

hadoop jar hadoop-mapreduce-examples-2.0.0-cdh4.1.2.jar wordcount input output
The output logs are as follows:

....
Job Counters
        Launched map tasks=1
        Launched reduce tasks=1
        Data-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=60336
        Total time spent by all reduces in occupied slots (ms)=63264
Map-Reduce Framework
        Map input records=5
        Map output records=7
        Map output bytes=56
        Map output materialized bytes=76
....
It does not seem to work: only one map task was launched.
I thought maybe it is because my input file is small (just 5 records). Is that right?
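One quick way to confirm how small the input is (a sketch; assumes the input directory is named "input", as in the command above):

    hadoop fs -du -s input    # prints the total size of the input path in bytes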
2013/3/14 Sai Sai <[EMAIL PROTECTED]>
> In Pseudo Mode, where is the setting to increase the number of mappers, or is this not possible?
> Thanks
> Sai
On 2013-3-15, at 18:32, Zheyi RONG <[EMAIL PROTECTED]> wrote:
> Indeed you cannot explicitly set the number of mappers, but you can still gain some control over it by setting mapred.max.split.size or mapred.min.split.size.
>
> For example, if you have a file of 10 GB (10737418240 B) and you would like 10 mappers, then each mapper has to deal with 1 GB of data.
> According to "splitsize = max(minimumSize, min(maximumSize, blockSize))", you can set mapred.min.split.size=1073741824 (1 GB), i.e.:
>
> $ hadoop jar yourjar -Dmapred.min.split.size=1073741824 yourargs
>
> It is well explained in this thread: http://stackoverflow.com/questions/9678180/change-file-split-size-in-hadoop
>
> Regards,
> Zheyi.
>
> On Fri, Mar 15, 2013 at 8:49 AM, YouPeng Yang <[EMAIL PROTECTED]> wrote:
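Applied to the wordcount run from earlier in this thread, that would look something like the following (a sketch; the bundled WordCount parses -D generic options via GenericOptionsParser, and the calculation assumes the default max split size of Long.MAX_VALUE and a 128 MB block size):

    # splitsize = max(1073741824, min(9223372036854775807, 134217728)) = 1073741824 (1 GB)
    # so a 10 GB input would yield 10737418240 / 1073741824 = 10 splits, i.e. 10 map tasks
    hadoop jar hadoop-mapreduce-examples-2.0.0-cdh4.1.2.jar wordcount \
        -Dmapred.min.split.size=1073741824 input output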