Re: loading Hadoop native libraries in HBase unit tests
Todd Lipcon 2012-02-13, 19:41
On Mon, Feb 13, 2012 at 11:21 AM, Mikhail Bautin
<[EMAIL PROTECTED]> wrote:
> Would the following work as a complete solution for any platform? We can
> make this conditional on a new Maven profile.
> - Download the sources of the Hadoop version being used
> - Run "ant compile-native"
> - Add the directory of libhadoop.so to java.library.path in the test JVM
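The third step above can be sketched in plain Java. This is only an illustration of how a test might probe whether a native library is reachable via java.library.path; the `NativeProbe` class and its helper are hypothetical names, not part of HBase or Hadoop:

```java
// Sketch: probe whether a native library on java.library.path can be loaded.
// NativeProbe is a hypothetical helper, not an existing HBase/Hadoop class.
public class NativeProbe {

    /** Returns true if System.loadLibrary(name) succeeds for this JVM. */
    static boolean canLoad(String name) {
        try {
            // Searches the directories listed in java.library.path,
            // e.g. the one containing libhadoop.so added by the test profile.
            System.loadLibrary(name);
            return true;
        } catch (UnsatisfiedLinkError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // A deliberately missing name demonstrates the failure path.
        System.out.println("bogus lib loadable: " + canLoad("no_such_lib_xyz"));
    }
}
```

In the proposed Maven profile, the directory could be passed to the test JVM via the surefire `argLine` (e.g. `-Djava.library.path=...`), since java.library.path is read at JVM startup.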
Sort of, except the compilation process differs based on the version -
e.g., newer versions use Maven to build instead.
> On Mon, Feb 13, 2012 at 11:15 AM, Todd Lipcon <[EMAIL PROTECTED]> wrote:
>> Also keep in mind it's not just the hadoop version, but also the glibc
>> version and host architecture. We'd have to publish built binaries for
>> all combinations of architecture*hadoopVersion*glibcVersion
>> Maybe we should just get a copy of _one_ of these versions on the
>> hudson build boxes, and have a new hudson job which runs whichever
>> tests depend on the native code there?
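One way to make that workable is for native-dependent tests to skip themselves when the library is absent, so the same suite passes on boxes without native bits and actually exercises them on the one Hudson box that has them. A minimal sketch, using plain Java rather than JUnit's `Assume.assumeTrue` (the class name is made up):

```java
// Sketch: run a native-dependent check only when libhadoop is present.
// NativeGatedTest is a made-up name; a real JUnit test would call
// Assume.assumeTrue(nativeAvailable()) so the framework reports a skip.
public class NativeGatedTest {

    /** True if libhadoop.so can be loaded from java.library.path. */
    static boolean nativeAvailable() {
        try {
            System.loadLibrary("hadoop"); // resolves to libhadoop.so on Linux
            return true;
        } catch (UnsatisfiedLinkError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        if (!nativeAvailable()) {
            System.out.println("SKIPPED: libhadoop.so not on java.library.path");
            return;
        }
        System.out.println("running native-dependent tests");
    }
}
```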
>> On Mon, Feb 13, 2012 at 10:52 AM, Roman Shaposhnik <[EMAIL PROTECTED]> wrote:
>> > On Mon, Feb 13, 2012 at 1:58 AM, Mikhail Bautin
>> > <[EMAIL PROTECTED]> wrote:
>> >> Then how about solving the issue for the most common case (the default
>> >> version of Hadoop)? We can import the default version of libhadoop.so
>> >> into the HBase codebase and load it in tests, as I mentioned. This can be
>> >> considered a hack but will definitely increase the test coverage.
>> > You're not proposing importing a native binary into a source tree, are you?
>> > That won't be very reliable at all.
>> > We can probably come up with a number of workarounds here, but at the end
>> > of the day, unless you recompiled the native bits here and now, chances
>> > are they won't be compatible with the OS you happen to be on.
>> > Thanks,
>> > Roman.
>> Todd Lipcon
>> Software Engineer, Cloudera
Todd Lipcon
Software Engineer, Cloudera