MapReduce, mail # user - Issue with running my first hadoop program using eclipse


Re: Issue with running my first hadoop program using eclipse
Mohammad Tariq 2013-05-10, 11:48
Hello Ramya,

   You might find this JIRA useful, which talks about the same problem:
https://issues.apache.org/jira/browse/HADOOP-7682?focusedCommentId=13236645#comment-13236645
For configuring the Eclipse plugin you could visit this page:
http://cloudfront.blogspot.in/2012/07/how-to-run-mapreduce-programs-using.html#.UYzeTkAW38s
I have tried to explain the procedure there. Let me know if you face any problem.

Warm Regards,
Tariq
cloudfront.blogspot.com
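For context, the workaround discussed in that JIRA amounts to rebuilding hadoop-core so that FileUtil.checkReturnValue warns instead of throwing when NTFS rejects a POSIX-style chmod. The sketch below is illustrative only: the class name and the strict/relaxed pairing are mine, not the actual Hadoop 1.0.4 source.

```java
import java.io.File;
import java.io.IOException;

// Hypothetical sketch of the HADOOP-7682 workaround. In hadoop-core 1.0.4,
// FileUtil.checkReturnValue throws when a chmod-style call returns false
// (as it does on NTFS); the patched build logs a warning and carries on.
public class RelaxedFileUtil {

    // Original behaviour: throw on failure (this is what produces the
    // "Failed to set permissions of path ... to 0700" error in the thread).
    static void checkReturnValueStrict(boolean rv, File p, String perm)
            throws IOException {
        if (!rv) {
            throw new IOException(
                "Failed to set permissions of path: " + p + " to " + perm);
        }
    }

    // Patched behaviour discussed in the JIRA: warn and continue, so job
    // submission can proceed on Windows.
    static void checkReturnValueRelaxed(boolean rv, File p, String perm) {
        if (!rv) {
            System.err.println(
                "WARN: Failed to set permissions of path: " + p + " to " + perm);
        }
    }

    public static void main(String[] args) {
        File staging = new File("\\tmp\\hadoop-ramyas\\mapred\\staging");
        // Simulate the failing setPermission call returning false on Windows:
        checkReturnValueRelaxed(false, staging, "0700");
        System.out.println("job submission continues");
    }
}
```

Note this means rebuilding (or shading) hadoop-core yourself; the JIRA comment linked above discusses the trade-offs.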
On Fri, May 10, 2013 at 4:32 PM, Ramya S <[EMAIL PROTECTED]> wrote:

> Hi,
>
> So it is not due to a problem with the map/reduce plugin, right?
> Then is it due to an error in configuring the map/reduce plugin?
> If so, can you explain the necessary steps?
>
>
> ________________________________
>
> From: Geelong Yao [mailto:[EMAIL PROTECTED]]
> Sent: Fri 5/10/2013 4:27 PM
> To: [EMAIL PROTECTED]
> Subject: Re: Issue with running my first hadoop program using eclipse
>
>
> I think the main problem may be the permissions on your tmp directory.
>
>
> 2013/5/10 Ramya S <[EMAIL PROTECTED]>
>
>
>         Hi,
>         I am new to Hadoop, so please help me tackle the following issue.
>
>         I have installed Apache Hadoop in pseudo-distributed mode on a
> single node with hadoop-1.0.4. It works fine and I tried some WordCount
> examples through PuTTY too. But moving to the Eclipse IDE on Windows, I
> got stuck on the first program. I have already installed JDK 6 and the
> MapReduce plugin version 1.0.4. When trying to run on Hadoop, the
> following error occurs:
>
>
>     WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>     ERROR security.UserGroupInformation: PriviledgedActionException as:ramyas cause:java.io.IOException: Failed to set permissions of path: \tmp\hadoop-ramyas\mapred\staging\ramyas-1375395355\.staging to 0700
>     java.io.IOException: Failed to set permissions of path: \tmp\hadoop-ramyas\mapred\staging\ramyas-1375395355\.staging to 0700
>         at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:689)
>         at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:662)
>         at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:509)
>         at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:344)
>         at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
>         at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:856)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Unknown Source)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
>         at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1261)
>         at WordCount.main(WordCount.java:29)
>
>         Also I want to know about configuring the map/reduce plugin in
> Eclipse.
>
>         Thanks and Regards
>
>         Ramya.S
>
>
>
>
> --
> From Good To Great
>
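Following up on the tmp-directory suggestion in the thread: a quick standalone check of whether the JVM user can create and owner-restrict a directory, which is roughly what RawLocalFileSystem.setPermission attempts when staging at 0700. This helper class is hypothetical, not part of Hadoop; on NTFS the set* calls can return false, which is exactly what triggers the IOException above.

```java
import java.io.File;

// Hypothetical helper: probe whether this JVM can create a directory under
// the system temp dir and apply owner-only (0700-like) permissions to it.
public class TmpPermCheck {
    public static void main(String[] args) {
        File dir = new File(System.getProperty("java.io.tmpdir"),
                            "hadoop-perm-check");
        boolean created = dir.mkdirs() || dir.isDirectory();
        // setReadable/setWritable/setExecutable with ownerOnly=true is a
        // rough stand-in for chmod 0700; a false return here mirrors the
        // failure Hadoop 1.0.4 turns into an IOException on Windows.
        boolean r = dir.setReadable(true, true);
        boolean w = dir.setWritable(true, true);
        boolean x = dir.setExecutable(true, true);
        System.out.println("created=" + created
                + " chmod700-like=" + (r && w && x));
        dir.delete();
    }
}
```

If this prints `chmod700-like=false` on your Windows box, the staging error is environmental rather than anything wrong with your WordCount code or the Eclipse plugin.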