I am Yoonmin from South Korea.
I have a question for you about the Hadoop classpath.
My Hadoop home folder is the same on all nodes in my cluster:
Then I compiled my own modified version of Hadoop, so I have a file named
Using scp, I distributed that file to all of my nodes, into both
/home/hadoop/hadoop-1.2.1 and /home/hadoop/hadoop-1.2.1/build.
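For reference, that copy step can be scripted; this is only a sketch — the nodes.txt file, the SCP override, and the jar path are illustrative assumptions, not details from this thread:

```shell
# Sketch: push a rebuilt jar to both Hadoop directories on every node.
# nodes.txt (one hostname per line) is assumed; SCP can be overridden
# (e.g. SCP=echo) to dry-run the commands instead of copying.
distribute_jar() {
  jar=$1
  while read -r node; do
    ${SCP:-scp} "$jar" "$node:/home/hadoop/hadoop-1.2.1/"
    ${SCP:-scp} "$jar" "$node:/home/hadoop/hadoop-1.2.1/build/"
  done
}
# usage: distribute_jar build/hadoop-core-1.2.2-SNAPSHOT.jar < nodes.txt
```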
Then I restarted DFS and MapReduce, but my changes did not take effect when I
submitted a new job to the cluster.
I thought that was strange, so I deleted hadoop-core-1.2.2-SNAPSHOT.jar from
the two folders I mentioned.
However, DFS and MapReduce still started successfully!
What is going on in my Hadoop installation?
I think this is strongly related to the classpath, so I am asking you what
determines the classpath here.
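One way to sanity-check a situation like this (a sketch, not something from the thread) is to look at which hadoop-core jar the Hadoop scripts actually put on the classpath:

```shell
# show_core_jars: filter a ':'-separated classpath for hadoop-core entries.
# Pipe the output of `hadoop classpath` into it to see which jar (if any)
# the daemons and job clients would actually load.
show_core_jars() {
  tr ':' '\n' | grep 'hadoop-core'
}
# usage (assuming the hadoop script is on $PATH):
#   hadoop classpath | show_core_jars
```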
From: Harsh J [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, January 28, 2014 7:20 AM
To: [EMAIL PROTECTED]
Subject: Re: how to set org.apache.hadoop classpath?
The imports referenced in your error come from the hadoop-common jar, and
won't be present in the hadoop-mapreduce-client-core jar.
The ideal way to set your compile classpath is to rely on the
"hadoop classpath" command. Provided $HADOOP_PREFIX/bin/ is on your $PATH,
do the below:
export CLASSPATH=$(hadoop classpath)
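A typical compile step on top of that command might look as follows (a sketch only: WordCount.java and the output paths are illustrative, and the HADOOP_CMD override exists just so the helper can be exercised without a live cluster):

```shell
# job_classpath: echo the compile classpath that the cluster's hadoop
# script reports. HADOOP_CMD is an illustrative override, defaulting to
# the real `hadoop` binary on $PATH.
job_classpath() {
  "${HADOOP_CMD:-hadoop}" classpath
}
# typical usage, assuming hadoop on $PATH and your own WordCount.java:
#   export CLASSPATH=$(job_classpath)
#   javac -d classes WordCount.java
#   jar cf wordcount.jar -C classes .
```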
On Mon, Jan 27, 2014 at 1:43 PM, EdwardKing <[EMAIL PROTECTED]> wrote:
still raises the above error? Where is it wrong?