RE: Execution directory for child process within mapper
Hi Joris,

You cannot configure the work directory directly. You can configure the local directory with the property 'mapred.local.dir', which is then used to create the work directory as '${mapred.local.dir}/taskTracker/jobcache/$jobid/$taskid/work'. Based on this, you can refer to your local command with a relative path when executing it.
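As a rough illustration of the pattern above (the local directory, job ID, and task attempt ID below are hypothetical examples, not values from this thread):

```java
public class WorkDirPath {
    // Assembles the task work directory from mapred.local.dir, following the
    // pattern described above: ${mapred.local.dir}/taskTracker/jobcache/$jobid/$taskid/work
    static String buildWorkDir(String localDir, String jobId, String taskId) {
        return localDir + "/taskTracker/jobcache/" + jobId + "/" + taskId + "/work";
    }

    public static void main(String[] args) {
        // Hypothetical example values, for illustration only.
        String localDir = "/data/hadoop/local";
        String jobId = "job_201109260001_0001";
        String taskId = "attempt_201109260001_0001_m_000000_0";
        System.out.println(buildWorkDir(localDir, jobId, taskId));
    }
}
```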

I hope this page will help you to understand the directory structure clearly. http://hadoop.apache.org/common/docs/r0.20.2/mapred_tutorial.html#Directory+Structure
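A related option, not mentioned in the reply above but worth noting as a sketch: if you launch the standalone executable yourself from Java, `ProcessBuilder.directory()` lets you set the child's working directory explicitly, so relative file names resolve against a directory you choose rather than the task's jobcache directory. The `pwd` command here is a stand-in for the real executable, and the example assumes a POSIX environment:

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;

public class ChildWorkDir {
    // Runs a child process with an explicit working directory and returns
    // the first line of its output (here, the directory the child reports).
    static String runInDir(File workDir) throws IOException, InterruptedException {
        // "pwd" stands in for the standalone executable; on a real slave node
        // you would substitute the command you need to run.
        ProcessBuilder pb = new ProcessBuilder("pwd");
        pb.directory(workDir);          // force the child's working directory
        pb.redirectErrorStream(true);   // merge stderr into stdout
        Process p = pb.start();
        String line;
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            line = r.readLine();
        }
        p.waitFor();
        return line;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(runInDir(new File("/tmp")));
    }
}
```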
From: Joris Poort [[EMAIL PROTECTED]]
Sent: Monday, September 26, 2011 11:20 PM
To: mapreduce-user
Subject: Execution directory for child process within mapper

As part of my Java mapper I have a command that executes some standalone
code on a local slave node. The code runs fine, unless it tries to
access some local files, in which case I get an error that it cannot
locate those files.

Digging a little deeper, it seems to be executing from the following directory:


But I am intending to execute from a local directory where the
relevant files are located:


Is there a way in Java/Hadoop to force execution from the local
directory where the relevant files are located, instead of the jobcache
directory automatically created in
Is there perhaps a better way to go about this?

Any help on this would be greatly appreciated!