I compiled hadoop-2.2.0-src from scratch for x64 and put the resulting .so in hadoop/lib/native/. I also compiled snappy from scratch and put it there. As a different approach, I installed snappy via sudo apt-get and then symlinked the resulting .so to hadoop/lib/native/libsnappy.so; still no luck.
What is going on here? Why won't Hadoop find my native libraries? Is there any log where I can check what went wrong during loading?
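One place to look, assuming a stock log4j setup, is the logger that reports native-library loading. Raising its level in etc/hadoop/log4j.properties makes the loader print the exact failure reason (bad path, linker error) at startup:

```properties
# etc/hadoop/log4j.properties: log why libhadoop/libsnappy fail to load
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG
```

The messages then show up in the daemon log (e.g. hadoop.log) the next time the JVM starts.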
Best regards, Yves
Thanks... after reading the source code of the native library loading process and enabling the DEBUG messages, I found this in "hadoop.log":
2014-02-11 14:01:14,084 DEBUG org.apache.hadoop.util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /hadoop-2.2.0/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /hadoop-2.2.0/lib/native/libhadoop.so.1.0.0)
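That UnsatisfiedLinkError is glibc's symbol-versioning check failing: the library was linked on a machine with glibc 2.14, but the server only has an older libc. A quick way to see which GLIBC symbol versions a shared object requires (the path below is illustrative; point it at your own libhadoop.so.1.0.0):

```shell
# List the GLIBC symbol versions a shared object was linked against.
# The highest version printed is the minimum glibc the target host needs.
objdump -T /hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 \
  | grep -o 'GLIBC_[0-9.]*' \
  | sort -u -V
```

If GLIBC_2.14 appears in the output, the library cannot load on a glibc 2.12 host.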
Is it possible to compile the native library against glibc 2.12? At the moment I am not able to upgrade glibc on the servers that run Hadoop.
Best regards Yves
On 11.02.2014 13:29, Ted Yu wrote:
Where did you compile your libhadoop.so.1.0.0? It is more likely that you compiled libhadoop.so.1.0.0 in an environment with glibc 2.14 but tried to use it in an environment that only has glibc 2.12. If you are using a Hadoop build you compiled yourself, it is best to compile it in an environment matching your production systems. Yong
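To catch the mismatch before it happens, a simple sanity check (assuming a glibc-based build host) is to compare the glibc on the build machine with the one on the servers before running the native build:

```shell
# glibc version of this machine; native code built here will require
# at least this version at runtime on the target servers.
getconf GNU_LIBC_VERSION        # e.g. "glibc 2.12"
ldd --version | head -n 1       # same information from the dynamic linker
```

If the build box reports 2.14 and the servers only have 2.12, one option is to build on (or in a container/chroot of) a distribution that ships glibc 2.12, so the resulting libhadoop.so loads on both.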