Ramasubramanian Narayanan... 2012-11-07, 16:11
You are correct. (D) automatically does (B): once isSplitable returns
false, a single map task reads the whole file, so that one mapper's
MapRunner iterates over all of the file's key-value pairs anyway.
On Wed, Nov 7, 2012 at 9:41 PM, Ramasubramanian Narayanan
<[EMAIL PROTECTED]> wrote:
> I came across the question below and I feel 'D' is the correct answer, but on
> some sites it is mentioned that 'B' is the correct answer... Can you please
> tell me which is the right one, with an explanation?
> In a MapReduce job, you want each of your input files processed by a single
> map task. How do you configure a MapReduce job so that a single map task
> processes each input file regardless of how many blocks the input file
> occupies?
> A. Increase the parameter that controls minimum split size in the job
> B. Write a custom MapRunner that iterates over all key-value pairs in the
> entire file.
> C. Set the number of mappers equal to the number of input files you want to
> process.
> D. Write a custom FileInputFormat and override the method isSplitable to
> always return false.