William Kang 2013-01-12, 18:58
Gaurav Kumar 2013-01-12, 19:11
William Kang 2013-01-15, 01:51
William Kang 2013-01-20, 03:45
Gaurav Kumar 2013-01-20, 05:13
I looked in Eclipse's error log, and these are the errors listed:
The command ("dfs.browser.action.delete") is undefined
The command ("dfs.browser.action.refresh") is undefined
The command ("dfs.browser.action.upload_dir") is undefined
The command ("dfs.browser.action.upload_files") is undefined
The command ("dfs.browser.action.mkdir") is undefined
The command ("dfs.browser.action.donload") is undefined
Everything is running on 64-bit Ubuntu 12.04, for both the cluster and Eclipse.
The interesting thing is that HDFS browsing, uploading, etc. all
work fine. And if I choose to run the driver as a "Java Application" it
works fine in standalone mode as well. But if I choose to run the
driver with "Run on Hadoop", it simply runs in standalone mode,
totally ignoring the cluster.
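For reference, here is a minimal driver sketch that targets the cluster
explicitly through the Configuration (it assumes the old 1.x-style keys; the
hostnames, ports, and job name are just placeholders for my setup):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class ClusterDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholders: replace with the actual NameNode and JobTracker addresses.
        conf.set("fs.default.name", "hdfs://namenode-host:9000");
        conf.set("mapred.job.tracker", "jobtracker-host:9001");

        Job job = new Job(conf, "my-job");
        job.setJarByClass(ClusterDriver.class);
        // ... set mapper, reducer, input/output formats and paths as usual ...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Run as a plain "Java Application" with these settings, the job is submitted to
the JobTracker rather than the local runner, which is the behavior I would
expect from "Run on Hadoop".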
Thanks for your help.
On Sun, Jan 20, 2013 at 12:13 AM, Gaurav Kumar <[EMAIL PROTECTED]> wrote:
> Hi William,
> What error/exception were you getting specifically?
> On which platforms are your clusters and eclipse running?
> Thanks & Regards,
> Gaurav Kumar
> Software Engineer
> HCL Technologies Ltd.
> Mob: +91-9953294125
> Blog: TechnoTurd <http://technoturd.wordpress.com/>
> Connect with me on LinkedIn <http://in.linkedin.com/in/gauravkumar37>
> On Sun, Jan 20, 2013 at 9:15 AM, William Kang <[EMAIL PROTECTED]> wrote:
>> I got the plugin to work fine for browsing the HDFS file system and running
>> code in local mode. But the "Run on Hadoop" button doesn't work at
>> all, so I can't deploy the code to the cluster.
>> Any suggestions?
>> Many thanks.
>> On Mon, Jan 14, 2013 at 8:51 PM, William Kang <[EMAIL PROTECTED]> wrote:
>> > Hi Gaurav,
>> > Thanks a lot. It worked! The only minor issue is that it doesn't
>> > compile correctly with Java 7, but it works fine with Java 6. Thank you
>> > very much for the instructions.
>> > Cao
>> > On Sat, Jan 12, 2013 at 2:11 PM, Gaurav Kumar <[EMAIL PROTECTED]> wrote:
>> >> Hi William,
>> >> The general approach is to
>> >> 1. check out the code from Apache's SVN
>> >> 2. modify build.properties in /src/contrib/eclipse-plugin and add
>> >> eclipse.home=<path to eclipse>
>> >> 3. download Apache Forrest 0.8 and Sun JDK 5
>> >> 4. run the ant command "ant clean package
>> >> -Djava5.home=/opt/java/jdk1.5.0_22"
>> >> (replace the paths as per your config; a rough example follows this list)
>> >> 5. you need to be online (internet access) for this step
>> >> 6. after that, the Eclipse plugin should be in
>> >> /build/contrib/eclipse-plugin
>> >> 7. the plugin built this way does not work correctly yet
>> >> 8. open the jar and add the jars of the following libraries to /lib inside
>> >> the jar: commons-configuration, commons-lang,
>> >> 9. modify MANIFEST.MF in /META-INF of the jar so that these paths are
>> >> included in its "Bundle-ClassPath:" entry
>> >> 10. copy this jar to the plugins folder of Eclipse
>> >> 11. run "eclipse -clean"
>> >> 12. switch to the Map/Reduce perspective
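>> >> For illustration only (the Forrest path and property, and the jar names
>> >> and versions, are placeholders that depend on your setup and your Hadoop
>> >> release), steps 4 and 9 could end up looking roughly like this:
>> >>   ant clean package -Djava5.home=/opt/java/jdk1.5.0_22 -Dforrest.home=/opt/apache-forrest-0.8
>> >>   Bundle-ClassPath: classes/,lib/hadoop-core.jar,lib/commons-configuration-1.6.jar,lib/commons-lang-2.4.jar
>> >> (keep in mind that MANIFEST.MF lines longer than 72 bytes must be wrapped,
>> >> with each continuation line starting with a single space)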
>> >> Two noteworthy points:
>> >> 1. it's really not worth building the plugin for later versions of
>> >> Hadoop, because even if the plugin compiles, its features don't work
>> >> properly
>> >> 2. this query may be better answered on the developers mailing
>> >> list.
>> >> Thanks & Regards,
>> >> Gaurav Kumar
>> >> Software Engineer
>> >> HCL Technologies Ltd.
>> >> Mob: +91-9953294125
>> >> Blog: TechnoTurd <http://technoturd.wordpress.com/>
>> >> Connect with me on LinkedIn <http://in.linkedin.com/in/gauravkumar37>
>> On Sun, Jan 13, 2013 at 12:28 AM, William Kang <[EMAIL PROTECTED]> wrote:
yiyu jia 2013-01-21, 00:40