Re: YARN: LocalResources and file distribution
Krishna Kishore Bonagiri 2013-12-06, 11:03
Thanks for the quick reply. I am now adding a resource to be localized on
the ContainerLaunchContext and referring to it in the command as
"./list.ksh". Is that enough?
With this change I have gotten past the previous error, but am now seeing the
error below. What else might I be missing?
2013-12-06 05:25:59,480 WARN
Failed to launch container.
java.io.IOException: Destination must be relative
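The "Destination must be relative" IOException points at the key used in the localResources map: YARN interprets that key as the destination name under the container's working directory, so it must be a relative path such as "list.ksh", not an absolute one. A minimal pure-Java sketch of the distinction (the map values are simplified to plain path strings here; in the real API they would be LocalResource objects):

```java
import java.util.HashMap;
import java.util.Map;

public class LocalResourceKeys {
    public static void main(String[] args) {
        // Wrong: an absolute destination key is rejected by the NodeManager
        // with "java.io.IOException: Destination must be relative".
        String badKey = "/user/kbonagir/KKDummy/list.ksh";
        // Right: a relative key; the launched command then refers to ./list.ksh.
        String goodKey = "list.ksh";

        // Simplified stand-in for Map<String, LocalResource>.
        Map<String, String> localResources = new HashMap<>();
        localResources.put(goodKey,
                "hdfs://isredeng:8020/user/kbonagir/KKDummy/list.ksh");

        System.out.println(badKey.startsWith("/"));   // true  -> would be rejected
        System.out.println(goodKey.startsWith("/"));  // false -> accepted
    }
}
```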
On Fri, Dec 6, 2013 at 12:48 PM, omkar joshi <[EMAIL PROTECTED]> wrote:
> Add this file to the set of files to be localized (LocalResourceRequest), and
> then refer to it as ./list.ksh. When adding it as a LocalResource, specify
> the path you have mentioned.
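Putting this advice together, here is a hedged sketch of the localization setup (assuming Hadoop 2.x APIs such as Records.newRecord and ConverterUtils.getYarnUrlFromPath, and that fs and ctx are an already-configured FileSystem and ContainerLaunchContext; this mirrors, not reproduces, the poster's actual code):

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.yarn.api.records.ContainerLaunchContext;
import org.apache.hadoop.yarn.api.records.LocalResource;
import org.apache.hadoop.yarn.api.records.LocalResourceType;
import org.apache.hadoop.yarn.api.records.LocalResourceVisibility;
import org.apache.hadoop.yarn.util.ConverterUtils;
import org.apache.hadoop.yarn.util.Records;

public class Localize {
    static void addScript(FileSystem fs, ContainerLaunchContext ctx) throws Exception {
        Path scriptPath = new Path("hdfs://isredeng:8020/user/kbonagir/KKDummy/list.ksh");
        FileStatus status = fs.getFileStatus(scriptPath);

        // Describe where the NodeManager should fetch the file from, and its
        // size/timestamp so the download can be validated.
        LocalResource scriptRsrc = Records.newRecord(LocalResource.class);
        scriptRsrc.setResource(ConverterUtils.getYarnUrlFromPath(scriptPath));
        scriptRsrc.setSize(status.getLen());
        scriptRsrc.setTimestamp(status.getModificationTime());
        scriptRsrc.setType(LocalResourceType.FILE);
        scriptRsrc.setVisibility(LocalResourceVisibility.APPLICATION);

        // The map key is the *relative* destination name in the container's
        // working directory -- this is what the command refers to as ./list.ksh.
        Map<String, LocalResource> localResources = new HashMap<>();
        localResources.put("list.ksh", scriptRsrc);
        ctx.setLocalResources(localResources);
        ctx.setCommands(Collections.singletonList("./list.ksh"));
    }
}
```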
> On Thu, Dec 5, 2013 at 10:40 PM, Krishna Kishore Bonagiri <
> [EMAIL PROTECTED]> wrote:
>> Hi Arun,
>> I have copied a shell script to HDFS and trying to execute it on
>> containers. How do I specify my shell script PATH in setCommands() call
>> on ContainerLaunchContext? I am doing it this way
>> String shellScriptPath = "hdfs://isredeng:8020/user/kbonagir/KKDummy/list.ksh";
>> But my container execution is failing saying that there is No such file
>> or directory!
>> org.apache.hadoop.util.Shell$ExitCodeException: /bin/bash:
>> hdfs://isredeng:8020/user/kbonagir/KKDummy/list.ksh: No such file or directory
>> at org.apache.hadoop.util.Shell.runCommand(Shell.java:464)
>> at org.apache.hadoop.util.Shell.run(Shell.java:379)
>> I could see this file with "hadoop fs" command and also saw messages in
>> Node Manager's log saying that the resource is downloaded and localized.
>> So, how do I run the downloaded shell script on a container?
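To answer this question directly: /bin/bash on the worker node has no notion of hdfs:// URLs, which is why passing the HDFS path to setCommands() fails. Once the script has been localized under a relative key, the launch command invokes it by that relative name instead. A pure-Java sketch of the command list (it would be passed to ContainerLaunchContext.setCommands()):

```java
import java.util.Collections;
import java.util.List;

public class LaunchCommand {
    public static void main(String[] args) {
        // This is what failed: bash cannot open an HDFS URL as a local file.
        String wrong = "hdfs://isredeng:8020/user/kbonagir/KKDummy/list.ksh";
        // After localization under the key "list.ksh", the script sits in the
        // container's working directory, so the command refers to it relatively:
        List<String> commands = Collections.singletonList("./list.ksh");
        System.out.println(commands.get(0));  // ./list.ksh
    }
}
```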
>> On Tue, Dec 3, 2013 at 4:57 AM, Arun C Murthy <[EMAIL PROTECTED]> wrote:
>>> YARN, by default, will only download *resources* from a shared namespace
>>> (e.g. HDFS).
>>> If /home/hadoop/robert/large_jar.jar is available on each node, then you
>>> can specify the path as file:///home/hadoop/robert/large_jar.jar and it
>>> should work.
>>> Else, you'll need to copy /home/hadoop/robert/large_jar.jar to HDFS and
>>> then specify hdfs://host:port/path/to/large_jar.jar.
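The two options differ only in the URI scheme handed to the resource: file:// assumes the jar already exists at that path on every node, while hdfs:// lets each NodeManager download it from the shared namespace. A small illustrative sketch (the host and 8020 port are placeholders):

```java
import java.net.URI;

public class ResourceSchemes {
    public static void main(String[] args) {
        // Usable only if the jar is pre-deployed at this path on *every* node:
        URI local = URI.create("file:///home/hadoop/robert/large_jar.jar");
        // Downloadable by every NodeManager from the shared filesystem:
        URI shared = URI.create("hdfs://namenode:8020/path/to/large_jar.jar");
        System.out.println(local.getScheme());   // file
        System.out.println(shared.getScheme());  // hdfs
    }
}
```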
>>> On Dec 1, 2013, at 12:03 PM, Robert Metzger <[EMAIL PROTECTED]> wrote:
>>> I'm currently writing code to run my application using Yarn (Hadoop
>>> I used this code as a skeleton:
>>> Everything works fine on my local machine or on a cluster with shared
>>> directories, but when I want to access resources outside of commonly
>>> accessible locations, my application fails.
>>> I have my application in a large jar file, containing everything