HDFS, mail # user - Submitting MapReduce job from remote server using JobClient


Re: Submitting MapReduce job from remote server using JobClient
Harsh J 2013-01-24, 15:12
The Job class itself has blocking and non-blocking submitters, similar to
the JobConf runJob method you discovered. See
http://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapreduce/Job.html#submit()
and the waitForCompletion() method that follows it. These seem to be what
you're looking for.
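A minimal sketch of both styles, using the new mapreduce API (MyMapper, MyReducer, the job name, and the paths are placeholders, not from the thread):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class RemoteSubmit {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration(); // picks up *-site.xml from the classpath
    Job job = Job.getInstance(conf, "my-job"); // on older releases: new Job(conf, "my-job")
    job.setJarByClass(RemoteSubmit.class);
    job.setMapperClass(MyMapper.class);     // placeholder mapper
    job.setReducerClass(MyReducer.class);   // placeholder reducer
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));

    // Non-blocking: submit() returns immediately...
    job.submit();
    // ...and you can monitor progress yourself:
    while (!job.isComplete()) {
      System.out.printf("map %.0f%% reduce %.0f%%%n",
          job.mapProgress() * 100, job.reduceProgress() * 100);
      Thread.sleep(5000);
    }
    // Blocking alternative: job.waitForCompletion(true)
    // submits and monitors (printing progress) in one call.
  }
}
```

waitForCompletion(true) is the usual choice for fire-and-wait clients; submit() plus isComplete()/mapProgress()/reduceProgress() gives you the monitoring hooks if you want to track the job from your own code.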

On Thu, Jan 24, 2013 at 5:43 PM, Amit Sela <[EMAIL PROTECTED]> wrote:
> Hi all,
>
> I want to run a MapReduce job using the Hadoop Java api from my analytics
> server. It is not the master or even a data node but it has the same Hadoop
> installation as all the nodes in the cluster.
> I tried using JobClient.runJob(), but it accepts a JobConf as its argument, and
> with JobConf it is only possible to set the old mapred Mapper classes, while I use
> mapreduce...
> I tried using JobControl and ControlledJob, but it seems to try to run the
> job locally; the map phase just keeps attempting...
> Has anyone tried this before?
> I'm just looking for a way to submit MapReduce jobs from Java code and be
> able to monitor them.
>
> Thanks,
>
> Amit.
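On the "runs locally" symptom: with the classic (MR1) configuration, mapred.job.tracker defaults to "local", which makes the client run the job in-process with the LocalJobRunner instead of submitting it to the cluster. A sketch of pointing the client configuration at the cluster explicitly (host names and ports here are placeholders; normally these values come from core-site.xml and mapred-site.xml on the classpath):

```java
Configuration conf = new Configuration();
// Placeholder addresses -- substitute your NameNode and JobTracker.
conf.set("fs.default.name", "hdfs://namenode-host:8020");
conf.set("mapred.job.tracker", "jobtracker-host:8021");
// If mapred.job.tracker is left at its default value "local",
// the job runs in-process on the submitting machine, which matches
// the behavior described above.
Job job = Job.getInstance(conf, "remote-job");
```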

--
Harsh J