Ranjini Rathinam 2013-12-06, 11:00
Re: hadoop - Mapreduce
Subroto 2013-12-06, 11:07
The number of mappers depends on the number of InputSplits, which in turn depends on the size of the input data.
The number of reducers can be configured via "mapred.reduce.tasks".
Further, you can get more information on the number of maps and reduces for a job from:
Running mappers and reducers in parallel depends on the availability of map and reduce slots in the cluster.
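To illustrate, here is a minimal sketch of setting the reducer count at submission time. The jar name, driver class, and paths are placeholders, and the -D generic option is only parsed if the driver goes through ToolRunner/GenericOptionsParser:

```shell
# Request 4 reduce tasks for this job (placeholder jar/class/paths).
# "mapred.reduce.tasks" is the old-style property name; newer Hadoop
# releases call it "mapreduce.job.reduces".
# Whether the 4 reducers actually run in parallel still depends on
# free reduce slots in the cluster.
hadoop jar my-job.jar MyJobDriver \
    -D mapred.reduce.tasks=4 \
    /input/path /output/path
```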
On Dec 6, 2013, at 12:00 PM, Ranjini Rathinam wrote:
> How to run more than one mapper and reducer in parallel?
> Please suggest. Thanks in advance.
Subroto 2013-12-06, 11:12
Ranjini Rathinam 2013-12-09, 13:00
Shekhar Sharma 2013-12-09, 16:53
Shekhar Sharma 2013-12-09, 16:06
Ranjini Rathinam 2013-12-11, 11:25