Re: ClassNotFoundException when loading an hbase table in mapreduce mode
I figured I would need to pass a lot of hbase jars, and the jars hbase depends
on, through the -D option. Instead, I added all the jars in hbase's lib
directory to HADOOP_CLASSPATH, and it worked. I got the idea from this page.
I had trouble following the exact steps, but I put the following at the end of
hadoop-env.sh to make hadoop aware of hbase:
for f in $HBASE_HOME/lib/*.jar; do
  export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$f
done
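The loop above can be checked in isolation. Here is a minimal sketch of the
same classpath-building idea as a standalone function; `build_cp` is a
hypothetical helper name (not part of Hadoop), and the example path assumes a
typical brew install of hbase:

```shell
# Build a colon-separated classpath from every .jar in a directory --
# the same idea as the hadoop-env.sh loop above.
build_cp() {
  cp=""
  for f in "$1"/*.jar; do
    [ -e "$f" ] || continue          # skip if the glob matched nothing
    cp=${cp:+$cp:}$f                 # append with ':' only after the first jar
  done
  printf '%s\n' "$cp"
}

# Example (path from a brew install; yours will differ):
build_cp /usr/local/opt/hbase/libexec/lib
```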
On Wed, Nov 21, 2012 at 7:56 PM, Cheolsoo Park <[EMAIL PROTECTED]> wrote:
> Hi Jack,
> PIG_CLASSPATH doesn't ship jar files to the back-end; it only adds them
> to the classpath on the front-end.
> java.lang.ClassNotFoundException: com.google.protobuf.Message
> You should make protobuf available to the mappers on the back-end. Please
> try passing it via -Dpig.additional.jars=<path to protobuf jar> in your Pig
> command. This will add the protobuf jar to the distributed cache as well as
> to the classpath in the mappers.
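> The suggestion above would look roughly like this on the command line; the
> jar path is only an illustration from a typical brew layout and will differ
> per machine:
>
> # Hypothetical path -- point it at your actual protobuf jar.
> pig -Dpig.additional.jars=/usr/local/opt/hbase/libexec/lib/protobuf-java.jar myscript.pig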
> On Tue, Nov 20, 2012 at 1:48 PM, Jinyuan Zhou <[EMAIL PROTECTED]> wrote:
> > Hi,
> > I am using org.apache.pig.backend.hadoop.hbase.HBaseStorage to load from
> > an hbase table in pig. It works in local mode, but when I try it in
> > mapreduce mode, the mappers get a ClassNotFoundException.
> > [main] ERROR org.apache.pig.tools.grunt.GruntParser - ERROR 2997: Unable
> > to recreate exception from backed error: Error:
> > java.lang.ClassNotFoundException: com.google.protobuf.Message
> > I installed hadoop, hbase, and hive through brew on my mac osx mountain
> > lion. I added the hbase jar to PIG_CLASSPATH to be able to load from
> > hbase in local mode. It seems I am still missing something.
> > Thanks,
> > Jack
-- Jinyuan (Jack) Zhou