MapReduce, mail # user - Execution directory for child process within mapper


RE: Execution directory for child process within mapper
Devaraj k 2011-09-26, 18:40
Hi Joris,

You cannot configure the work directory directly. You can configure the local directory with the property 'mapred.local.dir', which is then used to create the work directory as '${mapred.local.dir}/taskTracker/jobcache/$jobid/$taskid/work'. Based on this, you can refer to your local command with a relative path when executing it.
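To make the layout above concrete, here is a minimal sketch (plain Java, no Hadoop dependencies) that assembles the task work directory from its components; the job and attempt IDs are hypothetical values used only for illustration:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class WorkDirExample {
    // Builds the per-task work directory that the TaskTracker creates under
    // mapred.local.dir: ${mapred.local.dir}/taskTracker/jobcache/$jobid/$taskid/work
    static Path workDir(String mapredLocalDir, String jobId, String taskId) {
        return Paths.get(mapredLocalDir, "taskTracker", "jobcache",
                         jobId, taskId, "work");
    }

    public static void main(String[] args) {
        // Hypothetical IDs, for illustration only.
        Path dir = workDir("/data/hadoop/mapred/local",
                           "job_201109261253_0023",
                           "attempt_201109261253_0023_m_000001_0");
        System.out.println(dir);
        // Files placed in (or shipped to) this directory can then be
        // referenced with paths relative to it.
    }
}
```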

I hope this page helps you understand the directory structure clearly: http://hadoop.apache.org/common/docs/r0.20.2/mapred_tutorial.html#Directory+Structure
Thanks
Devaraj
________________________________________
From: Joris Poort [[EMAIL PROTECTED]]
Sent: Monday, September 26, 2011 11:20 PM
To: mapreduce-user
Subject: Execution directory for child process within mapper

As part of my Java mapper I have a command that executes some standalone
code on a local slave node. The code runs fine, unless it tries to
access some local files, in which case I get an error that it cannot
locate those files.

Digging a little deeper it seems to be executing from the following directory:

    /data/hadoop/mapred/local/taskTracker/{user}/jobcache/job_201109261253_0023/attempt_201109261253_0023_m_000001_0/work

But I am intending to execute from a local directory where the
relevant files are located:

    /home/users/{user}/input/jobname

Is there a way in Java/Hadoop to force execution from the local
directory, instead of the jobcache directory that Hadoop creates
automatically?

Is there perhaps a better way to go about this?

Any help on this would be greatly appreciated!

Cheers,

Joris