Re: org.apache.hadoop.hbase.mapreduce.Export fails with an NPE
I logged HADOOP-6695

Sent from my Verizon Wireless BlackBerry

-----Original Message-----
From: George Stathis <[EMAIL PROTECTED]>
Date: Sat, 10 Apr 2010 12:11:37
Subject: Re: org.apache.hadoop.hbase.mapreduce.Export fails with an NPE

OK, the issue remains in our Ubuntu EC2 dev environment, so it's not just my
local setup. Here are some more observations based on some tests I just ran:

   - If the zookeeper JAR is omitted from HADOOP_CLASSPATH, ClassNotFoundExceptions are thrown, as expected
   - If the zookeeper JAR is included in HADOOP_CLASSPATH, the ClassNotFoundExceptions go away, but the original NPE reappears at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase$TableRecordReader.restart(TableInputFormatBase.java:110)
   - If the zookeeper JAR is physically copied into $HADOOP_HOME/lib, the NPE goes away as well
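
The third observation above, in shell form (a sketch only; the paths are illustrative, and here they are simulated with temp directories just to make the copy step concrete -- a real environment would use the actual install locations):

```shell
# Simulated layout; in practice HBASE_HOME and HADOOP_HOME point at the
# real installations rather than temp dirs.
HBASE_HOME=$(mktemp -d)
HADOOP_HOME=$(mktemp -d)
mkdir -p "$HBASE_HOME/lib" "$HADOOP_HOME/lib"
touch "$HBASE_HOME/lib/zookeeper-3.2.2.jar"

# The workaround that makes the NPE go away: physically place the JAR in
# Hadoop's lib directory rather than relying on HADOOP_CLASSPATH alone.
cp "$HBASE_HOME/lib/zookeeper-3.2.2.jar" "$HADOOP_HOME/lib/"
```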

So, while the HADOOP_CLASSPATH is indeed being read, something goes missing
during the MapReduce run that keeps the HTable from being instantiated
properly in TableInputFormatBase unless certain JARs are physically present
in $HADOOP_HOME/lib. Note that this issue is not specific to the zookeeper
JAR either: we have enabled the transactional contrib indexed tables, and we
see the same problem unless we physically copy hbase-transactional-0.20.3.jar
into the Hadoop lib directory, even though it's in HADOOP_CLASSPATH as well.

It feels like there is a discrepancy in how classloading is done across the
various components, but I'm not sure whether this is an HBase issue or a
Hadoop one. It seems like a JIRA ticket candidate. Any thoughts on which
project should look at this first?
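
One alternative worth trying (my assumption, not something verified in this thread): if the job's driver parses its arguments through Hadoop's GenericOptionsParser, the JARs can be shipped with the job itself via -libjars, which distributes them to the task JVMs without touching $HADOOP_HOME/lib. A hypothetical invocation:

```shell
# Hypothetical -- only works if Export's main() runs its arguments through
# GenericOptionsParser; the table name and output path are placeholders.
hadoop org.apache.hadoop.hbase.mapreduce.Export \
  -libjars "$HBASE_HOME/lib/zookeeper-3.2.2.jar" \
  mytable /export/mytable
```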


On Fri, Apr 9, 2010 at 8:29 PM, George Stathis <[EMAIL PROTECTED]> wrote:

> Here is mine:
> export HADOOP_CLASSPATH="$HBASE_HOME/hbase-0.20.3.jar:$HBASE_HOME/hbase-0.20.3-test.jar:$HBASE_HOME/lib/zookeeper-3.2.2.jar:$HBASE_HOME/conf"
> $HBASE_HOME is defined in my .bash_profile, so it's already there and I see
> it expanded in the debug statements with the correct path. I even tried
> hard-coding the $HBASE_HOME path above just in case and I had the same
> issue.
> In any case, I'm past it now. I'll have to check whether the same issue
> happens on our dev environment running on Ubuntu on EC2. If not, then at
> least it's localized to my OSX environment.
> -GS
> On Fri, Apr 9, 2010 at 7:32 PM, Stack <[EMAIL PROTECTED]> wrote:
>> Very odd.  I don't have to do that running MR jobs.  I wonder what's
>> different? (I'm using the 0.20.4 near-candidate rather than 0.20.3,
>> 1.6.0u14).  I have a hadoop-env.sh like this:
>> export HBASE_HOME=/home/hadoop/0.20
>> export HBASE_VERSION=20.4-dev
>> #export HADOOP_CLASSPATH="$HBASE_HOME/conf:$HBASE_HOME/build/hbase-0.20.4-dev.jar:$HBASE_HOME/build/hbase-0.20.4-dev-test.jar:$HBASE_HOME/lib/zookeeper-3.2.2.jar"
>> export HADOOP_CLASSPATH="$HBASE_HOME/conf:$HBASE_HOME/build/hbase-0.${HBASE_VERSION}.jar:$HBASE_HOME/build/hbase-0.${HBASE_VERSION}-test.jar:$HBASE_HOME/lib/zookeeper-3.2.2.jar"
>> St.Ack
>> On Fri, Apr 9, 2010 at 4:19 PM, George Stathis <[EMAIL PROTECTED]>
>> wrote:
>> > Solved: for those interested, I had to explicitly copy
>> > zookeeper-3.2.2.jar to $HADOOP_HOME/lib even though I had added its
>> > path to HADOOP_CLASSPATH under $HADOOP_HOME/conf/hadoop-env.sh.
>> >
>> > It makes no sense to me why that particular JAR would not get picked
>> > up. It was even listed in the classpath debug output when I ran the
>> > job using the hadoop shell script. If anyone can enlighten, please do.
>> >
>> > -GS
>> >
>> > On Fri, Apr 9, 2010 at 5:56 PM, George Stathis <[EMAIL PROTECTED]>
>> wrote:
>> >
>> >> No dice. Classpath is now set. Same error. Meanwhile, I'm running
>> >> "$ hadoop org.apache.hadoop.hbase.PerformanceEvaluation sequentialWrite 1"
>> >> just fine, so MapRed is working at least.
>> >>
>> >> Still looking for suggestions then I guess.