Amit Sela 2013-01-24, 12:13
The Job class itself has both a blocking and a non-blocking submitter,
similar to the JobConf-based runJob method you discovered. See
Job.submit() and the method that follows it, waitForCompletion(). These
seem to be what you're looking for.
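For reference, a minimal sketch of submitting a job this way with the new mapreduce API. The mapper/reducer classes and the input/output paths are placeholders, not from the thread; the key point is that the Configuration picks up the cluster's *-site.xml files from the local Hadoop installation, so the job goes to the remote JobTracker rather than running locally:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class SubmitExample {
    public static void main(String[] args) throws Exception {
        // Reads core-site.xml / mapred-site.xml from the classpath,
        // so the job is submitted to the configured cluster.
        Configuration conf = new Configuration();
        Job job = new Job(conf, "example-job");
        job.setJarByClass(SubmitExample.class);
        job.setMapperClass(WordCountMapper.class);   // hypothetical mapper
        job.setReducerClass(WordCountReducer.class); // hypothetical reducer
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path("/in"));    // placeholder path
        FileOutputFormat.setOutputPath(job, new Path("/out")); // placeholder path

        // Blocking submit: returns when the job finishes,
        // printing progress to stdout along the way.
        boolean ok = job.waitForCompletion(true);
        System.exit(ok ? 0 : 1);
    }
}
```

For non-blocking submission, call job.submit() instead and then poll job.isComplete(), job.mapProgress(), and job.reduceProgress() to monitor the job from your own code.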
On Thu, Jan 24, 2013 at 5:43 PM, Amit Sela <[EMAIL PROTECTED]> wrote:
> Hi all,
> I want to run a MapReduce job using the Hadoop Java api from my analytics
> server. It is not the master or even a data node but it has the same Hadoop
> installation as all the nodes in the cluster.
> I tried using JobClient.runJob(), but it accepts a JobConf as its argument,
> and with JobConf it is only possible to set the old mapred Mapper classes,
> while I use the new mapreduce API.
> I tried using JobControl and ControlledJob, but it seems to try to run
> the job locally; the map phase just keeps attempting...
> Has anyone tried this before?
> I'm just looking for a way to submit MapReduce jobs from Java code and be
> able to monitor them.