MapReduce >> mail # user >> Job launch from eclipse


Han JU 2013-04-23, 13:21
shashwat shriparv 2013-04-23, 13:26
Mohammad Tariq 2013-04-23, 13:46
Re: Job launch from eclipse
Thanks Shashwat and Mohammad.
I'm exporting jars and running them with hadoop jar, but I think there
should be a better way.
I've tried a lot, but launching from Eclipse just doesn't work. I don't
really want to hardcode the jobtracker or HDFS information in my code.
Maybe it's a bug in the Hadoop Eclipse plugin? I'm using the one from
Hadoop 1.0.2; is there a newer version?

Thanks.
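For the record, the jar-export workflow discussed in this thread looks roughly
like this (the jar name, main class, and paths below are placeholders, and the
command must run on a machine with the Hadoop client installed):

```shell
# Package the job classes into a jar (e.g. Eclipse's Export > JAR file,
# or an ant/maven build), then submit it with the "jar" subcommand.
# Note: it is "hadoop jar", not "hadoop -jar".
hadoop jar wordcount.jar com.example.WordCount /user/han/input /user/han/output
```

Submitted this way, the job goes through the cluster's jobtracker and shows up
in the web UI, unlike an in-IDE local run.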
2013/4/23 Mohammad Tariq <[EMAIL PROTECTED]>

> Hello Han,
>
>       The reason behind this is that the jobs are running inside
> Eclipse itself and not getting submitted to your cluster. Please see if
> this link helps:
> http://cloudfront.blogspot.in/2013/03/mapreduce-jobs-running-through-eclipse.html#.UXaQsDWH6IQ
>
>
> Warm Regards,
> Tariq
> https://mtariq.jux.com/
> cloudfront.blogspot.com
>
>
> On Tue, Apr 23, 2013 at 6:56 PM, shashwat shriparv <
> [EMAIL PROTECTED]> wrote:
>
>> You need to generate a jar file, pass any parameters at run time rather
>> than fixing them in the code, and run it on Hadoop like: hadoop jar
>> jarfilename.jar <parameters>
>>
>> *Thanks & Regards*
>>
>> ∞
>> Shashwat Shriparv
>>
>>
>>
>> On Tue, Apr 23, 2013 at 6:51 PM, Han JU <[EMAIL PROTECTED]> wrote:
>>
>>> Hi,
>>>
>>> I'm getting my hands on hadoop. One thing I really want to know is how
>>> you launch MR jobs in a development environment.
>>>
>>> I'm currently using Eclipse 3.7 with the Hadoop plugin from Hadoop 1.0.2.
>>> With this plugin I can manage HDFS and submit jobs to the cluster. But the
>>> strange thing is, no job launched from Eclipse this way is recorded by
>>> the jobtracker (I can't monitor it from the web UI), yet the output
>>> appears in the HDFS path I gave as a parameter. This makes me think it is
>>> a standalone job run that then writes its output to HDFS.
>>>
>>> So how do you code and launch jobs to cluster?
>>>
>>> Many thanks.
>>>
>>> --
>>> JU Han
>>>
>>> UTC - Université de Technologie de Compiègne
>>> GI06 - Fouille de Données et Décisionnel
>>>
>>> +33 0619608888
>>>
>>
>>
>
--
JU Han

Software Engineer Intern @ KXEN Inc.
UTC - Université de Technologie de Compiègne
GI06 - Fouille de Données et Décisionnel

+33 0619608888
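For readers hitting the same issue: the behavior described in the thread is
the client defaulting to a local, in-process job run. One common workaround
(not spelled out in the thread itself; the file paths and class below are
illustrative) is to make the in-IDE Configuration point at the cluster by
loading the cluster's client config files, and to ship the job jar explicitly:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;

public class ClusterLauncher {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Load the cluster's client configs instead of hardcoding addresses
        // (adjust the paths to your cluster's conf directory).
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/mapred-site.xml"));
        // Equivalent to setting the Hadoop 1.x properties by hand:
        //   conf.set("fs.default.name", "hdfs://namenode:9000");
        //   conf.set("mapred.job.tracker", "jobtracker:9001");

        // Ship the job jar so the TaskTrackers can load the classes;
        // without a jar, the client falls back to a local run that the
        // jobtracker never sees.
        conf.set("mapred.jar", "/path/to/wordcount.jar");

        Job job = new Job(conf, "wordcount-from-eclipse");
        // ... set mapper, reducer, input and output paths as usual ...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

With the jobtracker address coming from mapred-site.xml rather than the code,
the same driver works both in the IDE and when packaged with hadoop jar.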