HBase, mail # user - Using the Hadoop bundled in the lib directory of HBase


Re: Using the Hadoop bundled in the lib directory of HBase
Mike Spreitzer 2011-02-14, 02:40
I do not see a BlockChannel.java in
http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-append/
--- nor do I see any references in there to BlockChannel.
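A recursive search over a local working copy is one way to settle this. A hedged sketch follows; the checkout directory name is hypothetical, and `grep_branch` is just an illustrative helper, not something from the thread:

```shell
# Sketch: list files in a working copy that mention a given identifier.
# Assumes a local checkout has already been made, e.g.:
#   svn co http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-append/ branch-0.20-append
grep_branch() {
  # -r recurses into the checkout, -l prints only matching file names.
  grep -rl "$1" "$2"
}
# e.g. grep_branch BlockChannel branch-0.20-append
```

If that prints nothing, the class genuinely is not in that branch's source tree, which would match what I am seeing.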

Thanks,
Mike Spreitzer
From:   Ryan Rawson <[EMAIL PROTECTED]>
To:     [EMAIL PROTECTED]
Date:   02/13/2011 03:51 PM
Subject:        Re: Using the Hadoop bundled in the lib directory of HBase

On Sun, Feb 13, 2011 at 8:29 AM, Mike Spreitzer <[EMAIL PROTECTED]>
wrote:
> Yes, I simply took the Hadoop 0.20.2 release, deleted its
> hadoop-core.jar, and replaced it with the contents of
> lib/hadoop-core-0.20-append-r1056497.jar from hbase.
>
> I'm not sure what to do with "this approach might work".  How can I know
> if it really does?
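The swap described in the quoted text can be sketched as a small shell helper. The jar names come from the thread, but the function, the install paths, and the exact name of the stock 0.20.2 core jar are assumptions:

```shell
# Hypothetical sketch of the jar swap described above. The paths are
# placeholders, and the 0.20.2 core jar name may differ in your distro.
swap_core_jar() {
  local hadoop_home=$1 hbase_home=$2
  # Back up the stock core jar rather than deleting it outright.
  mv "$hadoop_home/hadoop-0.20.2-core.jar" \
     "$hadoop_home/hadoop-0.20.2-core.jar.orig"
  # Drop in HBase's append-branch build under the name Hadoop expects.
  cp "$hbase_home/lib/hadoop-core-0.20-append-r1056497.jar" \
     "$hadoop_home/hadoop-0.20.2-core.jar"
}
# e.g. swap_core_jar /opt/hadoop-0.20.2 /opt/hbase
```

Keeping the `.orig` backup makes it trivial to roll back if the replacement jar misbehaves.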

I'm not sure; maybe it'll work great until, one day a month from now,
everything crashes and burns due to <thing no one could have guessed>.
Perhaps someone with extensive HDFS code experience could tell you.

>
> BTW, I see that HBase's lib/hadoop-core-0.20-append-r1056497.jar contains
> org/apache/hadoop/hdfs/server/datanode/BlockChannel.class but I am having
> trouble figuring out why.  From where in SVN does that come?

Is it not in the append-20-branch?
>
> Thanks,
> Mike Spreitzer
>
>
>
>
> From:   Ryan Rawson <[EMAIL PROTECTED]>
> To:     [EMAIL PROTECTED]
> Cc:     stack <[EMAIL PROTECTED]>
> Date:   02/13/2011 02:33 AM
> Subject:        Re: Using the Hadoop bundled in the lib directory of HBase
>
>
>
> If you are taking the jar that we ship and slamming it into a Hadoop
> 0.20.2-based distro, that might work.  I'm not sure if there are any
> differences other than pure code (which would then be expressed in the
> jar only), so this approach might work.
>
> You could also check out the revision from which we built our JAR and
> try that.  By default you need Apache Forrest (argh) and Java 5 to
> build Hadoop (ARGH), which makes it not buildable on OS X.
>
> Building sucks; there are no shortcuts.  Good luck out there!
> -ryan
>
> On Sat, Feb 12, 2011 at 11:24 PM, Mike Spreitzer <[EMAIL PROTECTED]>
> wrote:
>> Let me be clear about the amount of testing I did: extremely little.  I
>> should also point out that at first I did not fully appreciate the
>> meaning of your earlier comment to Vijay saying "this is a little off"
>> --- I now realize you were in fact saying that Vijay told me to do
>> things backward.
>>
>> Since my note saying the backward approach worked, two things have
>> happened: (1) someone made a link to it from
>> http://hbase.apache.org/notsoquick.html, and (2) Ryan Rawson replied
>> saying, in no uncertain terms, that the backward approach is
>> unreliable.  I would not have noticed a reliability issue in the
>> negligible testing I did.
>>
>> Having gotten two opposite opinions, I am now unsure of the truth of
>> the matter.  Is there any chance of Vijay and Ryan agreeing?
>>
>> Thanks,
>> Mike Spreitzer
>> SMTP: [EMAIL PROTECTED], Lotus Notes: Mike Spreitzer/Watson/IBM
>> Office phone: +1-914-784-6424 (IBM T/L 863-)
>> AOL Instant Messaging: M1k3Sprtzr
>>
>
>