Thanks for your reply. I tried Pig 0.9.1, and the problem is gone.
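For anyone stuck on 0.8, a minimal sketch of pointing Pig at the real Hadoop install via PIG_CLASSPATH (all paths and jar names below are assumptions; adjust to your setup):

```shell
# Build PIG_CLASSPATH from the actual Hadoop installation so Pig
# picks up the cluster's config and jars instead of its bundled ones.
HADOOP_HOME=/usr/lib/hadoop          # assumed install location
HADOOP_CONF_DIR=$HADOOP_HOME/conf    # the actual hadoop config directory
PIG_CLASSPATH=$HADOOP_CONF_DIR
# Add each real Hadoop jar (names here are illustrative):
for jar in hadoop-core-0.20.2.jar lib/commons-logging-1.1.1.jar; do
  PIG_CLASSPATH=$PIG_CLASSPATH:$HADOOP_HOME/$jar
done
export PIG_CLASSPATH
echo "$PIG_CLASSPATH"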
On Wed, Jan 18, 2012 at 1:49 PM, Dmitriy Ryaboy <[EMAIL PROTECTED]> wrote:
> Right, the classpath is not set right -- nothing about your hadoop
> environment is there, and the fat jar (the one that bundles hadoop) is on
> the classpath. Pig 9 is better about figuring this stuff out, especially if
> you install the rpm.
> Try modifying the command you see as the result of -secretDebugCmd to *not*
> include the snapshot jar, and to include your actual hadoop config
> directory and all the actual hadoop jars.
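A sketch of the command rewrite described above: drop the bundled SNAPSHOT jar from the classpath and put the real Hadoop conf directory and jars first (the paths and jar names here are assumptions, not the actual output):

```shell
# Classpath as it might appear in the -secretDebugCmd dry run:
CP="/home/huyong/pig-0.8.1/pig-0.8.1-SNAPSHOT.jar:/home/huyong/pig-0.8.1/lib/other.jar"
# Remove any SNAPSHOT (fat) jar entries:
CP=$(echo "$CP" | tr ':' '\n' | grep -v SNAPSHOT | paste -s -d: -)
# Prepend the real Hadoop config dir and Hadoop jar:
HADOOP_HOME=/usr/lib/hadoop   # assumed Hadoop install location
CP="$HADOOP_HOME/conf:$HADOOP_HOME/hadoop-core-0.20.2.jar:$CP"
echo "$CP"
```

Running the java command with this corrected classpath should then use the cluster's Hadoop rather than the bundled one.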
> On Wed, Jan 18, 2012 at 4:32 AM, yonghu <[EMAIL PROTECTED]> wrote:
>> If I rebuild Pig using ant jar-withouthadoop, the output of pig
>> -secretDebugCmd is the following:
>> dry run:
>> /usr/lib/jvm/java-6-sun/bin/java -Xmx1000m
>> -Dpig.log.file=pig.log -Dpig.home.dir=/home/huyong/pig-0.8.1/bin/..
>> -Dpig.root.logger=INFO,console,DRFA -classpath
>> As you can see, there is no Hadoop information.
>> On Wed, Jan 18, 2012 at 1:05 PM, Dmitriy Ryaboy <[EMAIL PROTECTED]> wrote:
>> > You need to make sure the hadoop jar being used at runtime is the exact
>> > same version as the version of hadoop you are using.
>> > What is the output of "pig -secretDebugCmd"? Does the hadoop jar in that
>> > classpath match the one you use to start hadoop?
>> > -D
>> > On Wed, Jan 18, 2012 at 2:03 AM, yonghu <[EMAIL PROTECTED]> wrote:
>> >> Hello,
>> >> My Pig version is 0.8.1. I found some information on the mailing list.
>> >> I rebuilt Pig using:
>> >> ant jar-withouthadoop
>> >> and replaced the hadoop jar file in /pig_home/build/ivy/lib/pig with
>> >> the hadoop-append jar. Finally, I exported HADOOP_HOME and PIG_HOME.
>> >> But whenever I started Pig, I got a message saying that the HDFS
>> >> versions of Pig and Hadoop are not compatible.
>> >> Can anyone tell me how to solve this problem?
>> >> Thanks
>> >> Yong