MapReduce user mailing list - Re: Provide context to map function


Re: Provide context to map function
Abhinav M Kulkarni 2013-04-02, 04:08
To be precise, I am using Hadoop 1.0.4.

There is no local variable or argument named context in the map function.

Thanks,
Abhinav
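
The missing context argument suggests the old org.apache.hadoop.mapred API rather
than the new org.apache.hadoop.mapreduce API that the quoted snippets below assume.
In the old API the current input split is still reachable through the Reporter
passed to map(). A minimal sketch under that assumption; the file name "fileA.txt"
and the branch bodies are hypothetical placeholders:

    import java.io.IOException;

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.FileSplit;
    import org.apache.hadoop.mapred.InputSplit;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;

    public class FileAwareOldApiMapper extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, Text> {

      public void map(LongWritable key, Text value,
                      OutputCollector<Text, Text> output, Reporter reporter)
          throws IOException {
        // The old API exposes the current split through the Reporter.
        InputSplit split = reporter.getInputSplit();
        if (split instanceof FileSplit) {
          Path path = ((FileSplit) split).getPath();
          if (path.getName().equals("fileA.txt")) {   // hypothetical first file
            // ... behaviour specific to the first file ...
          } else {
            // ... behaviour specific to the second file ...
          }
        }
        // ... common mapper logic / output.collect(...) as usual ...
      }
    }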

On 04/01/2013 09:06 PM, Azuryy Yu wrote:
> I assume your input splits are FileSplits; if you are not sure, you need to check:
>
> InputSplit split = context.getInputSplit();
>
> if (split instanceof FileSplit){
>   Path path = ((FileSplit)split).getPath();
> }
>
>
>
>
> On Tue, Apr 2, 2013 at 12:02 PM, Azuryy Yu <[EMAIL PROTECTED]> wrote:
>
>     In your map function, add the following:
>
>     Path currentInput = ((FileSplit)context.getInputSplit()).getPath();
>
>     then:
>
>     if (currentInput is the first file) {
>     ................
>     } else {
>     ..................
>     }
>
>
>
>
>     On Tue, Apr 2, 2013 at 11:55 AM, Abhinav M Kulkarni <[EMAIL PROTECTED]> wrote:
>
>         Hi,
>
>         I have the following scenario:
>
>           * Two mappers (acting on two different files) and one reducer
>           * The mapper code for the two files is the same, except for a
>             minor change that depends on which file is being read
>           * Essentially, assume there is an if statement: if the first
>             file is being read, do this; otherwise do that
>
>         So how do I provide this context to the map function, i.e. the
>         file name or, say, a boolean flag indicating which file is
>         being read?
>
>         Thanks,
>         Abhinav
>
>
>
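
Pulling the quoted suggestions together, a minimal sketch for the new
org.apache.hadoop.mapreduce API (Hadoop 1.x), where map() does receive a Context;
the file name "fileA.txt" and the branch bodies are hypothetical placeholders:

    import java.io.IOException;

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.InputSplit;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileSplit;

    public class FileAwareMapper extends Mapper<LongWritable, Text, Text, Text> {

      @Override
      protected void map(LongWritable key, Text value, Context context)
          throws IOException, InterruptedException {
        // The new API passes a Context, which exposes the current input split.
        InputSplit split = context.getInputSplit();
        if (split instanceof FileSplit) {
          Path path = ((FileSplit) split).getPath();
          if (path.getName().equals("fileA.txt")) {   // hypothetical first file
            // ... behaviour specific to the first file ...
          } else {
            // ... behaviour specific to the second file ...
          }
        }
        // ... common mapper logic / context.write(...) as usual ...
      }
    }

The instanceof check from the quoted reply is worth keeping: not every InputFormat
hands the mapper a FileSplit, and only FileSplit exposes getPath().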
