Thanks Shashwat and Mohammad.
I'm exporting jars and running them with hadoop jar, but I think we should have
a better way. I've tried a lot, but launching from Eclipse just doesn't work, and
I don't really want to hardcode the jobtracker or HDFS addresses in my code.
Maybe it's a bug in the Hadoop Eclipse plugin? I'm using the one from Hadoop
1.0.2; is there a newer version?
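For reference, a minimal driver sketch using Hadoop's ToolRunner pattern, which avoids hardcoding cluster addresses: getConf() picks up the jobtracker and HDFS locations from the *-site.xml files on the classpath, or from -fs/-jt/-conf command-line options. The class name MyJobDriver and the identity Mapper/Reducer are placeholders, not anyone's actual code:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Hypothetical driver class; substitute your own mapper/reducer.
public class MyJobDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // getConf() reflects core-site.xml / mapred-site.xml from the classpath
        // plus any -D, -conf, -fs or -jt options that ToolRunner has parsed,
        // so nothing cluster-specific is hardcoded here.
        Job job = new Job(getConf(), "my job");
        job.setJarByClass(MyJobDriver.class); // so the job jar gets shipped to the cluster
        job.setMapperClass(Mapper.class);     // identity mapper; replace with your own
        job.setReducerClass(Reducer.class);   // identity reducer; replace with your own
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new MyJobDriver(), args));
    }
}
```

Run from a node where the cluster's configuration files are on the classpath, or point at them explicitly with -conf, and the job should show up in the jobtracker web UI.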
2013/4/23 Mohammad Tariq <[EMAIL PROTECTED]>
> Hello Han,
>          The reason behind this is that the jobs are running inside
> Eclipse itself, in local mode, and not getting submitted to your cluster.
> Please see if this link helps:
> Warm Regards,
> On Tue, Apr 23, 2013 at 6:56 PM, shashwat shriparv <
> [EMAIL PROTECTED]> wrote:
>> You need to generate a jar file, pass any parameters at run time rather
>> than fixing them in the code, and run it on Hadoop like: hadoop jar jarfilename.jar <parameters>
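For example (the jar name, driver class, and paths below are made up for illustration; note the subcommand is "hadoop jar", with no dash):

```shell
# Export the jar from Eclipse (File > Export > JAR file), copy it to a
# cluster node, then submit it with input and output paths as arguments:
hadoop jar wordcount.jar com.example.WordCount /user/han/input /user/han/output
```

Submitted this way, the job runs on the cluster and is visible in the jobtracker web UI.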
>> *Thanks & Regards *
>> Shashwat Shriparv
>> On Tue, Apr 23, 2013 at 6:51 PM, Han JU <[EMAIL PROTECTED]> wrote:
>>> I'm getting my hands on hadoop. One thing I really want to know is how
>>> you launch MR jobs in a development environment.
>>> I'm currently using Eclipse 3.7 with the Hadoop plugin from Hadoop 1.0.2.
>>> With this plugin I can manage HDFS and submit jobs to the cluster. But the
>>> strange thing is, every job launched from Eclipse this way is not recorded
>>> by the jobtracker (I can't monitor it from the web UI), yet the output
>>> still appears in the HDFS path I gave as a parameter. It's strange enough
>>> to make me think it runs as a standalone job that then writes its output to HDFS.
>>> So how do you code and launch jobs to cluster?
>>> Many thanks.
>>> *JU Han*
>>> UTC - Université de Technologie de Compiègne
>>> GI06 - Fouille de Données et Décisionnel
>>> +33 0619608888
Software Engineer Intern @ KXEN Inc.
UTC - Université de Technologie de Compiègne
GI06 - Fouille de Données et Décisionnel