It looks like something is wrong with your configuration: the default file system is coming back as the local file system (file:///), but you are passing an HDFS URI to fs.exists(Path). I cannot tell for sure because I don't have access to com.amd.kdf.protobuf.SequentialFileDriver.main(SequentialFileDriver.java:64).
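The usual cause is that the core-site.xml on the client's classpath does not set a default file system, so it falls back to file:///. A minimal sketch of the relevant entry (the host and port below are placeholders for your namenode, not values taken from your setup):

```xml
<!-- core-site.xml on the client classpath -->
<configuration>
  <property>
    <!-- Hadoop 1.x name for the default file system -->
    <name>fs.default.name</name>
    <value>hdfs://your-namenode:54310</value>
  </property>
</configuration>
```

Alternatively, the same value can be set on the Configuration object in code before calling FileSystem.get(conf).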
If it works just fine from the command line, you could try doing a fork/exec to launch the process and then monitor it.
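A minimal sketch of that fork/exec approach in Java (the helper class name is illustrative, and the hadoop command line shown in the usage comment assumes the same invocation that already works in the shell):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.List;

// Hypothetical helper: launch an external command, echo its output,
// and return its exit code so the caller can monitor the job.
public class JobLauncher {
    public static int run(List<String> command) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.redirectErrorStream(true);  // merge stderr into stdout
        Process p = pb.start();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);  // stream job output as it runs
            }
        }
        return p.waitFor();  // block until the launched process exits
    }
}
// Usage (same command line that already works by hand):
//   int exit = JobLauncher.run(java.util.Arrays.asList(
//       "hadoop", "jar", "test.jar", "inputpath", "outPath"));
```

A nonzero exit code from waitFor() tells the calling application that the job failed, which is all the monitoring a batch driver usually needs.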
On 2/2/12 11:31 PM, "Abees Muhammad" <[EMAIL PROTECTED]> wrote:
Thanks for your reply. I have a MapReduce job jar file, let's call it test.jar. I execute it as hadoop jar test.jar inputpath outPath, and it runs successfully. Now I want to execute this job for a batch of files (20 files at a time). For this purpose I have created another Java application, which moves a batch of files from one HDFS location to another. After that, this application needs to execute the M/R job for that batch. We will invoke the second application (which executes the M/R job) as a Control-M job, but I don't know how to write the second Java application that invokes the M/R job. The code snippet I used for testing the jar which calls the M/R job is
List<String> arguments = new ArrayList<String>();
I executed this jar as java -jar M/RJobInvokeApp.jar but got the following error:
java.lang.IllegalArgumentException: Wrong FS: hdfs://ip address:54310/tmp/test-out, expected: file:///
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
On 2 February 2012 23:02, Robert Evans <[EMAIL PROTECTED]> wrote:
What happens? Is there an exception, or does nothing happen? I am curious. Also, how did you launch the other job that is trying to run this one? The hadoop script sets up a lot of environment variables, classpath entries, etc. to make Hadoop work properly, and some of that may not be set up correctly for RunJar to work.
On 2/2/12 9:36 AM, "Harsh J" <[EMAIL PROTECTED]> wrote:
Moving to common-user. Common-dev is for project development
discussions, not user help.
Could you elaborate on how you used RunJar? What arguments did you
provide, and is the target jar a runnable one or a regular jar? What
error did you get?
On Thu, Feb 2, 2012 at 8:44 PM, abees muhammad <[EMAIL PROTECTED]> wrote:
> I am a newbie to Hadoop development. I have a Map/Reduce job jar file, and I
> want to execute this jar file programmatically from another Java program. I
> used the following code to execute it:
> RunJar.main(String[] args). But the jar file is not executed.
> Can you please give me a workaround for this issue?
> View this message in context: http://old.nabble.com/Execute-a-Map-Reduce-Job-Jar-from-Another-Java-Program.-tp33250801p33250801.html
> Sent from the Hadoop core-dev mailing list archive at Nabble.com.
Customer Ops. Engineer
Cloudera | http://tiny.cloudera.com/about