Re: YARN: LocalResources and file distribution
Hi Omkar,
  Thanks for the quick reply. I am now adding a resource to be localized on
the ContainerLaunchContext like this:
        localResources.put("hdfs://isredeng:8020/user/kbonagir/KKDummy/list.ksh",
                shellRsrc);
        ctx.setLocalResources(localResources);

and referred to it as "./list.ksh". Is that enough?

With this change I have gotten past the previous error, but now I am seeing
the error below. What else might I be missing?

2013-12-06 05:25:59,480 WARN org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch: Failed to launch container.
java.io.IOException: Destination must be relative
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch$ShellScriptBuilder.symlink(ContainerLaunch.java:474)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.writeLaunchEnv(ContainerLaunch.java:723)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:254)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:79)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
        at java.util.concurrent.FutureTask.run(FutureTask.java:166)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:780)

Thanks,
Kishore
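
For what it's worth, the "Destination must be relative" message points at
the map key: the key in the local resources map becomes the name of the
symlink that the NodeManager creates in the container's working directory,
so it has to be a relative name such as "list.ksh" rather than the full
hdfs:// URI. A minimal sketch of the usual setup, modelled on the
simple-yarn-app skeleton referenced later in this thread (Hadoop 2.2.0 API;
conf is assumed to be a Configuration and ctx the ContainerLaunchContext
from the messages above):

    import java.util.Collections;
    import java.util.HashMap;
    import java.util.Map;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.yarn.api.ApplicationConstants;
    import org.apache.hadoop.yarn.api.records.LocalResource;
    import org.apache.hadoop.yarn.api.records.LocalResourceType;
    import org.apache.hadoop.yarn.api.records.LocalResourceVisibility;
    import org.apache.hadoop.yarn.util.ConverterUtils;
    import org.apache.hadoop.yarn.util.Records;

    // Describe the script sitting in HDFS. Size and timestamp are required
    // so the NodeManager can verify what it downloads.
    // (getFileStatus throws IOException; handle or declare it in the
    // enclosing method.)
    Path scriptPath = new Path("hdfs://isredeng:8020/user/kbonagir/KKDummy/list.ksh");
    FileStatus scriptStat = scriptPath.getFileSystem(conf).getFileStatus(scriptPath);

    LocalResource shellRsrc = Records.newRecord(LocalResource.class);
    shellRsrc.setResource(ConverterUtils.getYarnUrlFromPath(scriptPath));
    shellRsrc.setSize(scriptStat.getLen());
    shellRsrc.setTimestamp(scriptStat.getModificationTime());
    shellRsrc.setType(LocalResourceType.FILE);
    shellRsrc.setVisibility(LocalResourceVisibility.APPLICATION);

    // The map key is the link name created inside the container's working
    // directory, so it must be relative -- not the hdfs:// URI.
    Map<String, LocalResource> localResources = new HashMap<String, LocalResource>();
    localResources.put("list.ksh", shellRsrc);
    ctx.setLocalResources(localResources);

    // The launch command then refers to the localized copy by that
    // relative name.
    ctx.setCommands(Collections.singletonList(
        "/bin/sh ./list.ksh"
            + " 1>" + ApplicationConstants.LOG_DIR_EXPANSION_VAR + "/stdout"
            + " 2>" + ApplicationConstants.LOG_DIR_EXPANSION_VAR + "/stderr"));

With the resource keyed as "list.ksh" the symlink destination stays
relative, which matches the advice in the replies quoted below.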

On Fri, Dec 6, 2013 at 12:48 PM, omkar joshi <[EMAIL PROTECTED]> wrote:

> Add this file to the files to be localized (LocalResourceRequest), and
> then refer to it as ./list.ksh. While adding it to the LocalResource,
> specify the path which you have mentioned.
>
>
> On Thu, Dec 5, 2013 at 10:40 PM, Krishna Kishore Bonagiri <[EMAIL PROTECTED]> wrote:
>
>> Hi Arun,
>>
>>   I have copied a shell script to HDFS and am trying to execute it on
>> containers. How do I specify my shell script's path in the setCommands()
>> call on the ContainerLaunchContext? I am doing it this way:
>>
>>       String shellScriptPath = "hdfs://isredeng:8020/user/kbonagir/KKDummy/list.ksh";
>>       commands.add(shellScriptPath);
>>
>> But my container execution is failing, saying that there is no such file
>> or directory:
>>
>> org.apache.hadoop.util.Shell$ExitCodeException: /bin/bash: hdfs://isredeng:8020/user/kbonagir/KKDummy/list.ksh: No such file or directory
>>
>>         at org.apache.hadoop.util.Shell.runCommand(Shell.java:464)
>>         at org.apache.hadoop.util.Shell.run(Shell.java:379)
>>         at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
>>
>> I could see this file with the "hadoop fs" command, and I also saw
>> messages in the Node Manager's log saying that the resource was
>> downloaded and localized. So, how do I run the downloaded shell script on
>> a container?
>>
>> Thanks,
>> Kishore
>>
>>
>>
>> On Tue, Dec 3, 2013 at 4:57 AM, Arun C Murthy <[EMAIL PROTECTED]> wrote:
>>
>>> Robert,
>>>
>>>  YARN, by default, will only download *resources* from a shared
>>> namespace (e.g. HDFS).
>>>
>>>  If /home/hadoop/robert/large_jar.jar is available on each node, then
>>> you can specify the path as file:///home/hadoop/robert/large_jar.jar and
>>> it should work.
>>>
>>>  Else, you'll need to copy /home/hadoop/robert/large_jar.jar to HDFS and
>>> then specify hdfs://host:port/path/to/large_jar.jar.
>>>
>>> hth,
>>> Arun
>>>
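
To put those two options in code form: the URI scheme of the Path used to
build the LocalResource is what decides where each NodeManager fetches the
file from. A small sketch only, reusing the placeholder paths from the
reply above; either Path can be fed into the same FileStatus/LocalResource
construction sketched near the top of this page:

    // Option 1: the jar has been copied to HDFS, so every NodeManager
    // downloads it from the shared namespace (host:port is a placeholder
    // taken from the reply above).
    Path jarPath = new Path("hdfs://host:port/path/to/large_jar.jar");

    // Option 2: the jar already exists at this exact path on every node,
    // so no copy to HDFS is needed and the NodeManager reads it locally.
    // Path jarPath = new Path("file:///home/hadoop/robert/large_jar.jar");

    // In both cases the size and timestamp recorded in the LocalResource
    // must match the file the NodeManager actually finds.
    // (conf: the same Configuration as in the earlier sketch.)
    FileStatus jarStat = jarPath.getFileSystem(conf).getFileStatus(jarPath);

For the file:/// case the jar has to be present, with the same size and
timestamp, at that path on every node.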
>>> On Dec 1, 2013, at 12:03 PM, Robert Metzger <[EMAIL PROTECTED]> wrote:
>>>
>>> Hello,
>>>
>>> I'm currently writing code to run my application using Yarn (Hadoop
>>> 2.2.0).
>>> I used this code as a skeleton:
>>> https://github.com/hortonworks/simple-yarn-app
>>>
>>> Everything works fine on my local machine or on a cluster with shared
>>> directories, but when I want to access resources outside of commonly
>>> accessible locations, my application fails.
>>>
>>> I have my application in a large jar file, containing everything