Re: How to make Pig work with hadoop-0.20-append
I tried doing that, but I get the same error.

Here is the listing of the folder after replacing the jar files:

hadoop@ub13:/usr/local/pig/build/ivy/lib/Pig$ ls
ant-1.6.5.jar
commons-cli-1.2.jar
commons-codec-1.3.jar
commons-el-1.0.jar
commons-httpclient-3.0.1.jar
commons-lang-2.4.jar
commons-logging-1.1.1.jar
commons-net-1.4.1.jar
core-3.1.1.jar
ftplet-api-1.0.0.jar
ftpserver-core-1.0.0.jar
ftpserver-deprecated-1.0.0-M2.jar
guava-r06.jar
hadoop-0.20-append-for-hbase-core.jar
hadoop-0.20-append-for-hbase-test.jar
hbase-0.90.0.jar
hbase-0.90.0-tests.jar
hsqldb-1.8.0.10.jar
jackson-core-asl-1.0.1.jar
jackson-mapper-asl-1.0.1.jar
jasper-compiler-5.5.12.jar
jasper-runtime-5.5.12.jar
javacc-4.2.jar
javacc.jar
jets3t-0.7.1.jar
jetty-6.1.14.jar
jetty-util-6.1.14.jar
jline-0.9.94.jar
joda-time-1.6.jar
jsch-0.1.38.jar
jsp-2.1-6.1.14.jar
jsp-api-2.1-6.1.14.jar
junit-4.5.jar
jython-2.5.0.jar
kfs-0.3.jar
log4j-1.2.14.jar
mina-core-2.0.0-M5.jar
oro-2.0.8.jar
servlet-api-2.5-6.1.14.jar
slf4j-api-1.5.2.jar
slf4j-log4j12-1.4.3.jar
xmlenc-0.52.jar
zookeeper-3.3.3.jar
That listing is after replacing the hadoop-0.20-append-for-hbase jar files.

It's still not working :-(
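In case it helps anyone spot my mistake, here is roughly what the swap looked like, sketched against a scratch directory so it is safe to run as-is. On my box PIG_HOME is the real /usr/local/pig tree and the append jar comes from my hadoop-0.20-append build output; the paths and jar names below are stand-ins.

```shell
# Sketch of the jar swap, run against a scratch directory so nothing real
# is touched. On a real setup PIG_HOME would be the Pig source tree and the
# append jar would come from the hadoop-0.20-append build.
PIG_HOME=$(mktemp -d)
IVY_DIR="$PIG_HOME/build/ivy/lib/Pig"
mkdir -p "$IVY_DIR"
touch "$IVY_DIR/hadoop-core-0.20.2.jar"    # stand-in for the jar Ivy downloaded

APPEND_BUILD=$(mktemp -d)                  # stand-in for the hadoop-append build dir
touch "$APPEND_BUILD/hadoop-0.20-append-for-hbase-core.jar"

# The actual swap: drop the stock hadoop-core jar, copy the append jar in.
rm -f "$IVY_DIR/hadoop-core-0.20.2.jar"
cp "$APPEND_BUILD/hadoop-0.20-append-for-hbase-core.jar" "$IVY_DIR/"

ls "$IVY_DIR"
```

If replacing the jar alone is not enough, maybe Pig also has to be rebuilt (with ant) so the bundled pig.jar picks up the new classes?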

Thanks,
Praveenesh

On Mon, Jul 4, 2011 at 11:27 AM, Daniel Dai <[EMAIL PROTECTED]> wrote:

> One way to make it work is to replace
> build/ivy/lib/Pig/hadoop-core-0.20.2.jar with
> build/hadoop-core-0.20.3-SNAPSHOT.jar from hadoop append.
>
> Daniel
>
> On Mon, Jul 4, 2011 at 12:38 AM, praveenesh kumar <[EMAIL PROTECTED]> wrote:
>
> > Hello people,
> > I am new to pig.
> > Currently I am using hadoop and hbase together.
> > Since hadoop-0.20-append supports HBase in production, I am currently
> > using the hadoop-0.20-append jar files.
> >
> > Now I want to use a Pig version that works with 0.20-append.
> >
> > I am trying Pig 0.8, but it does not seem to work.
> > Whenever I run Pig in map-reduce mode, it gives me ERROR 2999.
> > Here is the output of my log file:
> >
> > hadoop@ub13:/usr/local/pig/bin$ pig
> >
> > 2011-07-01 17:41:52,150 [main] INFO  org.apache.pig.Main - Logging error messages to: /usr/local/pig/bin/pig_1309522312144.log
> > 2011-07-01 17:41:52,454 [main] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://ub13:54310
> > 2011-07-01 17:41:52,654 [main] ERROR org.apache.pig.Main - ERROR 2999: Unexpected internal error. Failed to create DataStorage
> >
> > LOG MESSAGE -----
> >
> > Error before Pig is launched---------------------------
> >
> > ERROR 2999: Unexpected internal error. Failed to create DataStorage
> >
> > java.lang.RuntimeException: Failed to create DataStorage
> > at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:75)
> > at org.apache.pig.backend.hadoop.datastorage.HDataStorage.<init>(HDataStorage.java:58)
> > at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:214)
> > at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:134)
> > at org.apache.pig.impl.PigContext.connect(PigContext.java:183)
> > at org.apache.pig.PigServer.<init>(PigServer.java:226)
> > at org.apache.pig.PigServer.<init>(PigServer.java:215)
> > at org.apache.pig.tools.grunt.Grunt.<init>(Grunt.java:55)
> > at org.apache.pig.Main.run(Main.java:452)
> > at org.apache.pig.Main.main(Main.java:107)
> > Caused by: org.apache.hadoop.ipc.RPC$VersionMismatch: Protocol org.apache.hadoop.hdfs.protocol.ClientProtocol version mismatch. (client = 41, server = 43)
> > at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:364)
> > at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
> > at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)