Hadoop >> mail # user >> Native libraries for multiple architectures?


Re: Native libraries for multiple architectures?
Would it work if you package your native library under the directory  
of lib/native/<arch>/...?
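A note on the layout Owen is suggesting: Hadoop looks for native libraries under per-platform subdirectories of `lib/native/`, where the directory name is built from the OS name, architecture, and data model (the exact names below are illustrative for a mixed 32/64-bit Linux cluster, not a guarantee for every Hadoop version):

```
lib/native/Linux-i386-32/libmylib.so
lib/native/Linux-amd64-64/libmylib.so
```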

On Jul 10, 2009, at 12:46 PM, Todd Lipcon wrote:

> Hi Stuart,
>
> Hadoop itself doesn't have any nice way of dealing with this that I know of.
> I think your best bet is to do something like:
>
> String dataModel = System.getProperty("sun.arch.data.model");
> if ("32".equals(dataModel)) {
>   System.loadLibrary("mylib_32bit");
> } else if ("64".equals(dataModel)) {
>   System.loadLibrary("mylib_64bit");
> } else {
>   throw new RuntimeException("Unknown data model: " + dataModel);
> }
>
> Then include your libraries as libmylib_32bit.so and libmylib_64bit.so in
> the distributed cache.
>
> Hope that helps
> -Todd
>
> On Fri, Jul 10, 2009 at 12:19 PM, Stuart White <[EMAIL PROTECTED]> wrote:
>
>> My hadoop cluster is a combination of i386-32bit and amd64-64bit machines.
>> I have some native code that I need to execute from my mapper.  I have
>> different native libraries for the different architectures.
>>
>> How can I accomplish this?  I've looked at using -files or DistributedCache
>> to push the native libraries to the nodes, but I can't figure out how to
>> make sure I link against the correct native library (for the architecture
>> the map task is running on).
>>
>> Anyone else run into this?  Any suggestions?
>>
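Todd's snippet can be fleshed out into a self-contained sketch. The class and method names (`NativeLoader`, `pickLibraryName`) and the library base names (`mylib_32bit`, `mylib_64bit`) are illustrative assumptions, not anything Hadoop mandates; the only real API used is the JVM's `sun.arch.data.model` system property, which reports the pointer width on HotSpot-derived JVMs:

```java
// Hedged sketch of the approach from the thread: choose a native library
// name based on the JVM's data model (32 vs 64 bit). Library names are
// hypothetical examples.
public class NativeLoader {

    // Decide which library basename matches the running JVM,
    // without actually loading anything.
    static String pickLibraryName() {
        String dataModel = System.getProperty("sun.arch.data.model");
        if ("32".equals(dataModel)) {
            return "mylib_32bit";
        } else if ("64".equals(dataModel)) {
            return "mylib_64bit";
        } else {
            // Property missing or an unexpected value (e.g. on a non-HotSpot JVM).
            throw new RuntimeException("Unknown data model: " + dataModel);
        }
    }

    public static void main(String[] args) {
        // In a real mapper you would call System.loadLibrary(pickLibraryName())
        // once, e.g. in a static initializer; here we only print the choice.
        System.out.println("Would load: " + pickLibraryName());
    }
}
```

Doing the check in a static initializer of the mapper class keeps the load to once per JVM, which matters when the task tracker reuses JVMs across tasks.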