Re: Job launch from eclipse
Hello Han,

      The reason behind this is that the jobs are running inside Eclipse
itself and not getting submitted to your cluster. Please see if this link
helps:
http://cloudfront.blogspot.in/2013/03/mapreduce-jobs-running-through-eclipse.html#.UXaQsDWH6IQ
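
For reference, here is a rough, self-contained sketch of one common way to make
a job launched from Eclipse actually go to a Hadoop 1.x cluster: point the
Configuration at the cluster and hand it a pre-built job jar. The host names,
ports and jar path below are placeholders, not values from this thread.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class EclipseSubmitDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Without these two settings the default configuration falls back to
        // the local job runner, which is why nothing shows up in the
        // JobTracker web UI.
        conf.set("fs.default.name", "hdfs://namenode-host:9000");   // placeholder
        conf.set("mapred.job.tracker", "jobtracker-host:9001");     // placeholder
        // Eclipse runs loose .class files, so ship a pre-built jar that holds
        // your mapper/reducer classes.
        conf.set("mapred.jar", "/path/to/your-job.jar");            // placeholder

        Job job = new Job(conf, "submit-from-eclipse");
        job.setMapperClass(Mapper.class);     // identity mapper, stand-in for your own
        job.setReducerClass(Reducer.class);   // identity reducer, stand-in for your own
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

With those properties set, the run is tracked by the JobTracker instead of
executing in the local job runner inside Eclipse.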
Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com
On Tue, Apr 23, 2013 at 6:56 PM, shashwat shriparv <
[EMAIL PROTECTED]> wrote:

> You need to generate a jar file, pass all the parameters at runtime rather
> than hard-coding any, and run it on Hadoop like:
> hadoop jar jarfilename.jar <parameters>
> (A sketch of this workflow follows the quoted thread below.)
>
> *Thanks & Regards*
>
> ∞
> Shashwat Shriparv
>
>
>
> On Tue, Apr 23, 2013 at 6:51 PM, Han JU <[EMAIL PROTECTED]> wrote:
>
>> Hi,
>>
>> I'm getting my hands on Hadoop. One thing I really want to know is how
>> you launch MR jobs in a development environment.
>>
>> I'm currently using Eclipse 3.7 with the Hadoop plugin from Hadoop 1.0.2.
>> With this plugin I can manage HDFS and submit jobs to the cluster. But the
>> strange thing is that no job launched from Eclipse this way is recorded by
>> the JobTracker (I can't monitor it from the web UI), yet the output
>> eventually appears in the HDFS path I passed as a parameter. This makes me
>> think it runs as a standalone job and then writes its output to HDFS.
>>
>> So how do you code and launch jobs to the cluster?
>>
>> Many thanks.
>>
>> --
>> *JU Han*
>>
>> UTC - Université de Technologie de Compiègne
>> GI06 - Fouille de Données et Décisionnel
>>
>> +33 0619608888
>>
>
>
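
For reference, a minimal, self-contained sketch of the jar-based workflow
suggested above. The word-count logic, class names and paths are illustrative
placeholders, not code from this thread; build the class into a jar and submit
it with the hadoop jar command shown in the comment.

// Package this class into a jar and submit it from a cluster client with:
//   hadoop jar my-job.jar WordCountDriver /input/path /output/path
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class WordCountDriver extends Configured implements Tool {

    public static class TokenMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(Object key, Text value, Context ctx)
                throws IOException, InterruptedException {
            // Emit (token, 1) for every whitespace-separated token in the line.
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    ctx.write(word, ONE);
                }
            }
        }
    }

    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                throws IOException, InterruptedException {
            // Sum the counts for each token.
            int sum = 0;
            for (IntWritable v : values) sum += v.get();
            ctx.write(key, new IntWritable(sum));
        }
    }

    @Override
    public int run(String[] args) throws Exception {
        Job job = new Job(getConf(), "word-count");
        job.setJarByClass(WordCountDriver.class);  // resolves the jar when run via 'hadoop jar'
        job.setMapperClass(TokenMapper.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new WordCountDriver(), args));
    }
}

Because the driver is submitted through the hadoop command, it picks up the
cluster configuration from the client machine and the run is tracked by the
JobTracker.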