MapReduce >> mail # dev >> how to set org.apache.hadoop classpath?


Re: how to set org.apache.hadoop classpath?
Try this

bash$ export CLASSPATH="path/to/jar/file:path/to/jar/file2"
bash$ javac MyMainClass.java
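The compile errors further down the thread are most likely because the imported classes (Configuration, Path, IntWritable, Text) live in hadoop-common, and only the mapreduce client jar was ever exported. A minimal sketch of the fix, assuming the standard 2.2.0 layout and the /home/software install path quoted later in the thread (adjust both to your machine):

```shell
# Install location taken from the thread; adjust for your setup.
HADOOP_HOME=/home/software/hadoop-2.2.0

# javac reads CLASSPATH (or -cp), not HADOOP_CLASSPATH. Configuration,
# Path, IntWritable, and Text are in hadoop-common, so that jar is needed
# in addition to the mapreduce client jar.
export CLASSPATH=".:$HADOOP_HOME/share/hadoop/common/hadoop-common-2.2.0.jar:$HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar"
echo "$CLASSPATH"

# With the jars actually present at those paths, this should now compile:
# javac WordCount1.java
```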

Sent from my iPhone
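For context on why the HADOOP_CLASSPATH export in the quoted message below did not help: that variable is consumed by the bin/hadoop launcher script, not by javac. A one-off alternative to exporting CLASSPATH is to hand the launcher's own classpath straight to the compiler; the live call is left commented since it requires a working install:

```shell
# HADOOP_CLASSPATH is read by bin/hadoop, not by javac. With a working
# Hadoop install you can reuse the launcher's full classpath directly:
#
#   javac -cp "$(hadoop classpath)" WordCount1.java
#
# -cp overrides the CLASSPATH environment variable for that invocation.
echo 'javac -cp "$(hadoop classpath)" WordCount1.java'
```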

> On Jan 27, 2014, at 2:04 AM, "EdwardKing" <[EMAIL PROTECTED]> wrote:
>
> I am new to Hadoop. How do I build the libs into my classpath? Could you give me the detailed commands?
> Thanks
>
> ----- Original Message -----
> From: "Siddharth Tiwari" <[EMAIL PROTECTED]>
> To: <[EMAIL PROTECTED]>
> Sent: Monday, January 27, 2014 4:56 PM
> Subject: Re: how to set org.apache.hadoop classpath?
>
>
> You need the build libs in your classpath. Export them via the CLASSPATH variable.
>
> Sent from my iPhone
>
>> On Jan 27, 2014, at 1:41 AM, "EdwardKing" <[EMAIL PROTECTED]> wrote:
>>
>> I have set HADOOP_CLASSPATH, but it still raises errors.
>>
>> [hadoop@master ~]$ export HADOOP_CLASSPATH=.:/home/software/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar
>> [hadoop@master ~]$ javac WordCount1.java
>> WordCount1.java:2: error: package org.apache.hadoop.conf does not exist
>> import org.apache.hadoop.conf.Configuration;
>>                            ^
>> WordCount1.java:3: error: package org.apache.hadoop.fs does not exist
>> import org.apache.hadoop.fs.Path;
>>
>> ----- Original Message -----
>> From: "Ravindra" <[EMAIL PROTECTED]>
>> To: <[EMAIL PROTECTED]>
>> Sent: Monday, January 27, 2014 4:24 PM
>> Subject: Re: how to set org.apache.hadoop classpath?
>>
>>
>>> export LIB_JARS=jar1,jar2....
>>> export HADOOP_CLASSPATH=.....
>>>
>>> This should be used for testing only; for better performance at run
>>> time, make use of the distributed cache.
>>>
>>> --
>>> Ravi.
>>> *''We do not inherit the earth from our ancestors, we borrow it from our
>>> children.'' PROTECT IT !*
>>>
>>>
>>>> On Mon, Jan 27, 2014 at 1:43 PM, EdwardKing <[EMAIL PROTECTED]> wrote:
>>>>
>>>> I use hadoop-2.2.0 under CentOS-5.8, and I have set JAVA_HOME, HADOOP_HOME,
>>>> and CLASSPATH:
>>>>
>>>> [hadoop@master ~]$ echo $JAVA_HOME
>>>> /home/software/jdk1.7.0_02
>>>> [hadoop@master ~]$ echo $HADOOP_HOME
>>>> /home/software/hadoop-2.2.0
>>>> [hadoop@master mapreduce]$ echo $CLASSPATH
>>>> .:/home/software/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.2.0.jar:
>>>>
>>>> I wrote a Java file like the following:
>>>>
>>>> import org.apache.hadoop.conf.Configuration;
>>>> import org.apache.hadoop.fs.Path;
>>>> import org.apache.hadoop.io.IntWritable;
>>>> import org.apache.hadoop.io.Text;
>>>> public class WordCount1{
>>>> ...........
>>>> }
>>>>
>>>> Then I compiled this Java file, but it raised the following errors:
>>>>
>>>> [hadoop@master ~]$ javac WordCount1.java
>>>> WordCount1.java:2: error: package org.apache.hadoop.conf does not exist
>>>> import org.apache.hadoop.conf.Configuration;
>>>>                            ^
>>>> WordCount1.java:3: error: package org.apache.hadoop.fs does not exist
>>>> import org.apache.hadoop.fs.Path;
>>>>
>>>> I have put hadoop-mapreduce-client-core-2.2.0.jar on the CLASSPATH, so why
>>>> does it still raise the above errors? What is wrong?
>>>>
>>>> Thanks.
>>>>
>>>>