Re: java.lang.UnsatisfiedLinkError - Unable to load libGfarmFSNative library
Is "libgfarm.so.1" installed and available on all systems? You're facing a
link error though hadoop did try to load the library it had (
libGfarmFSNative.so). If the "gfarm" guys have a mailing list, thats
probably the best place to ask.
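
A quick check (a diagnostic sketch; the library path below is taken from
your stack trace) is to run ldd against the JNI library on each node and
look for unresolved entries:

$ ldd /data/local3/marilia/hadoop-1.1.2/lib/native/Linux-amd64-64/libGfarmFSNative.so

A line like "libgfarm.so.1 => not found" would confirm it. Also note that
java.library.path only helps the JVM locate libGfarmFSNative.so itself;
its dependency libgfarm.so.1 is resolved by the system's dynamic linker
using LD_LIBRARY_PATH and the ldconfig cache, so copying the .so files
into Hadoop's lib/native directory alone won't fix it.
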
On Thu, Jun 27, 2013 at 1:06 AM, Marília Melo <[EMAIL PROTECTED]> wrote:

> Hi all,
>
> I'm trying to install a plugin called gfarm_hadoop that allows me to use a
> filesystem called gfarm instead of HDFS (
> https://sourceforge.net/projects/gfarm/files/gfarm_hadoop/).
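>
> For reference, I believe the filesystem is wired in through
> conf/core-site.xml (a sketch; the class name is the one that appears in
> my stack trace below):
>
>   <property>
>     <name>fs.gfarm.impl</name>
>     <value>org.apache.hadoop.fs.gfarmfs.GfarmFileSystem</value>
>   </property>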
>
> I have used it before, but now I'm trying to install it in a new cluster
> and for some reason it isn't working...
>
> After installing gfarm 2.5.8 at /data/local3/marilia/gfarm, Hadoop 1.1.2
> at /data/local3/marilia/hadoop-1.1.2, and the plugin, listing the new
> filesystem works fine:
>
> $ bin/hadoop fs -ls gfarm:///
> Found 26 items
> -rwxrwxrwx   1        101 2013-06-26 02:36 /foo
> drwxrwxrwx   -          0 2013-06-26 02:43 /home
>
> But then when I try to run an example, the task eventually completes, but
> I get "Unable to load libGfarmFSNative library" errors. Looking at the log
> messages it seems to be a path problem, but I have tried almost everything
> and it doesn't work.
>
> The way I'm setting the path now is by adding the following line to
> conf/hadoop-env.sh:
>
> export LD_LIBRARY_PATH=/data/local3/marilia/gfarm/lib
>
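> I also wondered whether the setting actually reaches the task JVMs; would
> something like this in conf/mapred-site.xml be needed as well? (Just a
> guess based on the mapred.child.env description; the path is from my
> install.)
>
>   <property>
>     <name>mapred.child.env</name>
>     <value>LD_LIBRARY_PATH=/data/local3/marilia/gfarm/lib</value>
>   </property>
>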
> I have even moved all the .so files to the hadoop directory, but I still
> get the same message...
>
>
> Any ideas?
>
> Thanks in advance.
>
>
> Log:
>
> $ bin/hadoop jar hadoop-examples-*.jar teragen 1000 gfarm:///inoa11
> Generating 1000 using 2 maps with step of 500
> 13/06/27 03:57:32 INFO mapred.JobClient: Running job: job_201306270356_0001
> 13/06/27 03:57:33 INFO mapred.JobClient:  map 0% reduce 0%
> 13/06/27 03:57:38 INFO mapred.JobClient:  map 50% reduce 0%
> 13/06/27 03:57:43 INFO mapred.JobClient: Task Id :
> attempt_201306270356_0001_m_000001_0, Status : FAILED
> java.lang.Throwable: Child Error
>        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> java.lang.Throwable: Child Error
>        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201306270356_0001_m_000001_0: java.lang.UnsatisfiedLinkError:
> /data/local3/marilia/hadoop-1.1.2/lib/native/Linux-amd64-64/libGfarmFSNative.so:
> libgfarm.so.1: cannot open shared object file: No such file or directory
> attempt_201306270356_0001_m_000001_0:   at
> java.lang.ClassLoader$NativeLibrary.load(Native Method)
> attempt_201306270356_0001_m_000001_0:   at
> java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1803)
> attempt_201306270356_0001_m_000001_0:   at
> java.lang.ClassLoader.loadLibrary(ClassLoader.java:1728)
> attempt_201306270356_0001_m_000001_0:   at
> java.lang.Runtime.loadLibrary0(Runtime.java:823)
> attempt_201306270356_0001_m_000001_0:   at
> java.lang.System.loadLibrary(System.java:1028)
> attempt_201306270356_0001_m_000001_0:   at
> org.apache.hadoop.fs.gfarmfs.GfarmFSNative.<clinit>(GfarmFSNative.java:9)
> attempt_201306270356_0001_m_000001_0:   at
> org.apache.hadoop.fs.gfarmfs.GfarmFileSystem.initialize(GfarmFileSystem.java:34)
> attempt_201306270356_0001_m_000001_0:   at
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1411)
> attempt_201306270356_0001_m_000001_0:   at
> org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
> attempt_201306270356_0001_m_000001_0:   at
> org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1429)
> attempt_201306270356_0001_m_000001_0:   at
> org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
> attempt_201306270356_0001_m_000001_0:   at
> org.apache.hadoop.fs.Path.getFileSystem(Path.java:187)

Harsh J