You can definitely run the Driver (ClassWithMain) against a remote Hadoop
cluster from, say, Eclipse by following these steps:
a) Have the jar (Some.jar) on the classpath of your project in Eclipse.
b) Ensure you have set both the NameNode and JobTracker information, either
in core-site.xml and mapred-site.xml or through conf.set(...).
c) In the main method of the Driver class, have the following. Below, *hdfs*
is a user who has permission to run jobs on the Hadoop cluster.
public static void main(final String[] args) throws Exception {
    // Submit the job as a user ("hdfs" here) that has permission
    // to run jobs on the remote cluster.
    UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hdfs");
    int status = ugi.doAs(new PrivilegedExceptionAction<Integer>() {
        public Integer run() throws Exception {
            return ToolRunner.run(new Driver(), args);
        }
    });
    System.exit(status);
}
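To make step b) concrete, here is a minimal sketch of what the Driver passed to ToolRunner.run() could look like, with the cluster addresses set programmatically instead of via the XML files. This is essentially a configuration fragment: the host names, ports, and the SomeMapper/SomeReducer class names are placeholders for illustration, not values from your cluster, and it only runs against a live cluster with the Hadoop jars on the classpath.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;

public class Driver extends Configured implements Tool {
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();
        // Equivalent of the core-site.xml / mapred-site.xml entries in step b).
        // Replace the host names and ports with your cluster's values.
        conf.set("fs.default.name", "hdfs://namenode.example.com:8020");
        conf.set("mapred.job.tracker", "jobtracker.example.com:8021");

        Job job = new Job(conf, "remote-job");
        job.setJarByClass(Driver.class);        // ships Some.jar to the cluster
        job.setMapperClass(SomeMapper.class);   // placeholder mapper class
        job.setReducerClass(SomeReducer.class); // placeholder reducer class
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }
}
```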
On Fri, Aug 23, 2013 at 9:37 AM, 정재부 <[EMAIL PROTECTED]> wrote:
> I usually build an executable jar with a main method and run it from the
> command line with "hadoop jar Some.jar ClassWithMain input output".
> In this main method, the Job and Configuration are set up, and the
> Configuration/Job classes have setters to specify the mapper or reducer class.
> However, when submitting a job remotely, I have to set the jar and the
> Mapper (and other classes) through the Hadoop client API.
> I want to programmatically transfer the jar from the client to the remote
> Hadoop cluster and execute it like the "hadoop jar" command does, so the
> main method can specify the mapper and reducer.
> So how can I deal with this problem?