pig hbase; java.lang.ClassNotFoundException: com.google.protobuf.Message


Re: pig hbase; java.lang.ClassNotFoundException: com.google.protobuf.Message
Thank you very much. The 'REGISTER' has worked for me. I gave the full path
of the jar file in 'REGISTER' and the program ran successfully.
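
A REGISTER line with the full path at the top of the script looks roughly like
this (the jar path here is illustrative, taken from later in this thread, and
will differ per install):

    -- register the protobuf jar (illustrative path) so it ships with the MR tasks
    REGISTER /opt/hbase-0.94.1/lib/protobuf-java-2.4.0a.jar;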

Thanks again,
Kiran.
On Sat, Feb 9, 2013 at 1:20 AM, Ramakrishna Nalam <[EMAIL PROTECTED]> wrote:

> Also 'REGISTER' the jars that'll be used in the MR tasks in the script.
> Else, add them to 'pig.additional.jars' in pig.properties.
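
A minimal sketch of the pig.properties alternative, assuming the protobuf jar
path mentioned later in this thread (multiple jars are colon-separated):

    # conf/pig.properties -- illustrative entry, adjust the path to your install
    pig.additional.jars=/opt/hbase-0.94.1/lib/protobuf-java-2.4.0a.jar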
>
> Regards,
> Rama.
>
>
>
> > On Sat, Feb 9, 2013 at 7:16 AM, kiran chitturi <[EMAIL PROTECTED]> wrote:
>
> > Thanks for the reply!
> >
> > Yes. I did add the hbase dependencies to the PIG_CLASSPATH. I am using the
> > lines below in my pig script.
> >
> >        field = LOAD 'hbase://documents' USING
> > org.apache.pig.backend.hadoop.hbase.HBaseStorage('info:collection',
> > '-loadKey false') as (fieldOL);
> >        store field into 'results/extract' using PigStorage(';');
> >
> > Thanks,
> > Kiran.
> >
> >
> >
> > On Fri, Feb 8, 2013 at 8:44 PM, Harsha <[EMAIL PROTECTED]> wrote:
> >
> > > kiran,
> > >     if you are trying to access protobuf from inside a pig script, put the
> > > jar in PIG_CLASSPATH.
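
A minimal sketch of that, assuming the protobuf jar path from later in this
thread; PIG_CLASSPATH is read from the environment before launching pig:

    # illustrative path -- adjust to your install
    export PIG_CLASSPATH=/opt/hbase-0.94.1/lib/protobuf-java-2.4.0a.jar:$PIG_CLASSPATH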
> > >
> > > --
> > > Harsha
> > >
> > >
> > > On Friday, February 8, 2013 at 4:56 PM, kiran chitturi wrote:
> > >
> > > > Hi!
> > > >
> > > > I am trying to use pig and hbase, but I keep running into a
> > > > ClassNotFoundException error. I have tried a few things, but they have
> > > > never worked.
> > > >
> > > > I am using pig 0.10.1 and hbase 0.94.1, hadoop 1.0.4. I have updated my
> > > > HADOOP_CLASSPATH in hadoop-env.sh as per this post [0].
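
The hadoop-env.sh change referred to here would look roughly like this
(illustrative; the linked post [0] covers the exact jars to add):

    # hadoop-env.sh -- illustrative, prepend the HBase protobuf jar on every node
    export HADOOP_CLASSPATH=/opt/hbase-0.94.1/lib/protobuf-java-2.4.0a.jar:$HADOOP_CLASSPATH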
> > > >
> > > > After updating my classpath, when I run the command
> > > > '/opt/hadoop/bin/hadoop classpath', I can see the protobuf jar file in it.
> > > >
> > > > Yet, when I run the pig script that loads from HBase, I keep getting
> > > > this error in the map tasks (Error: java.lang.ClassNotFoundException:
> > > > com.google.protobuf.Message).
> > > >
> > > > Every data node has the protobuf jar file in its hadoop classpath. I have
> > > > also tried adding the jar file like this (/opt/pig-0.10.1/bin/pig
> > > > /opt/pig_programs/testHbase.pig
> > > > -Dpig.additional.jars=/opt/hbase-0.94.1/lib/protobuf-java-2.4.0a.jar).
> > > >
> > > > I keep running into this error.
> > > >
> > > > Can anyone please let me know how to solve this issue?
> > > >
> > > >
> > > > Many Thanks,
> > > > Kiran.
> > > >
> > > > [0] -
> > > > http://mail-archives.apache.org/mod_mbox/pig-user/201211.mbox/%3CCANBTPCHb5+kFyew+[EMAIL PROTECTED]%3E
> > > >
> > > > --
> > > > Kiran Chitturi
> > > >
> > > >
> > >
> > >
> > >
> >
> >
> > --
> > Kiran Chitturi
> >
>

--
Kiran Chitturi