MapReduce, mail # user - Question related to Number of Mapper


Re: Question related to Number of Mapper
Michael Segel 2012-11-07, 17:30
The larger question is how many blocks are required to store a 100MB file if the HDFS block size is 64MB.

If it takes 2 blocks, then when you run your job you will have 1 mapper per block, unless the file is not splittable. (But from your example it's a simple text file, which is splittable.)
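
To make the arithmetic concrete, a quick back-of-the-envelope sketch in plain Java (just an illustration; the constants are the ones from the question below):

public class SplitMath {
    public static void main(String[] args) {
        long blockSize = 64L * 1024 * 1024;   // 64MB HDFS block size
        long fileSize  = 100L * 1024 * 1024;  // one 100MB plain text file
        int  files     = 100;                 // files in the input directory

        // Ceiling division: 100MB over 64MB blocks -> 2 blocks per file.
        long blocksPerFile = (fileSize + blockSize - 1) / blockSize;

        // A splittable file gets one mapper per block by default.
        System.out.println("Blocks per file: " + blocksPerFile);          // 2
        System.out.println("Map tasks total: " + blocksPerFile * files);  // 200
    }
}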

Again, where are you getting these questions?
On Nov 7, 2012, at 10:46 AM, Ramasubramanian Narayanan <[EMAIL PROTECTED]> wrote:

> Hi,
>
> Thanks!
>
> But the answer is given as 100 Mappers... I think we can also make the number of Mappers the same as the number of input files (not for this question)... If you know more detail on that, please share.
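>
> For what it's worth, a minimal sketch of that idea, assuming the newer mapreduce API (the class name is just illustrative): overriding isSplitable() to return false turns each input file into a single split, i.e., exactly one Mapper per file no matter how many blocks it spans.
>
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.mapreduce.JobContext;
> import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
>
> // Non-splittable text input: one split (and one Mapper) per file.
> public class WholeFileTextInputFormat extends TextInputFormat {
>     @Override
>     protected boolean isSplitable(JobContext context, Path file) {
>         return false; // never split, regardless of block count
>     }
> }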
>
> Note: I forgot where I took this question from :)
>
> regards,
> Rams.
>
> On Wed, Nov 7, 2012 at 10:01 PM, Michael Segel <[EMAIL PROTECTED]> wrote:
> 0. Custer didn't run. He got surrounded and then massacred.  :-P (See Custer's Last Stand at Little Big Horn.)
>
> Ok... plain text files: 100 files, 2 blocks each, would by default attempt to schedule 200 mappers.
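>
> For reference, a minimal driver sketch under those defaults (assuming the Hadoop 2.x mapreduce API; the input path is made up): TextInputFormat splits at HDFS block boundaries unless the split size is overridden, so 100 files x 2 blocks each -> 200 map tasks.
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.mapreduce.Job;
> import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
> import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
>
> public class MapperCountDemo {
>     public static void main(String[] args) throws Exception {
>         Job job = Job.getInstance(new Configuration(), "mapper-count-demo");
>         job.setJarByClass(MapperCountDemo.class);
>         job.setInputFormatClass(TextInputFormat.class); // splittable text input
>         FileInputFormat.addInputPath(job, new Path("/data/hundred-files"));
>         // Default split size == block size: each 100MB file -> 2 splits,
>         // so 100 files -> 200 map tasks.
>         // To get fewer, larger splits, raise the minimum split size past
>         // the file size (one split per file):
>         // FileInputFormat.setMinInputSplitSize(job, 128L * 1024 * 1024);
>     }
> }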
>
> Is this one of those online Cert questions?
>
> On Nov 7, 2012, at 10:20 AM, Ramasubramanian Narayanan <[EMAIL PROTECTED]> wrote:
>
> > Hi,
> >
> > A Custer’s HDFS block size is 64MB. You have a directory containing 100 plain text files, each of which is 100MB in size. The InputFormat for your job is TextInputFormat. How many Mappers will run?
> >
> > regards,
> > Rams
> >
>