Jean-Marc Spaggiari 2013-02-08, 16:30
Harsh J 2013-02-08, 16:49
I have installed all the dependencies, but I'm trying to build the
amd64 version, and I think it's not working because there are some
hardcoded parameters for 32-bit builds.
I tried to apply this patch to 1.0.3, but of course it's not working.
Too many differences.
I was able to make good progress, but I'm still stuck at ant
compile-contrib -Dlibhdfs=1 -Dfusedfs=1 with:
[exec] gcc -Wall -O3
-lhdfs -L/lib -lfuse -L/usr/local/jdk1.7.0_05//jre/lib/amd64/server
-ljvm -o fuse_dfs fuse_dfs.o fuse_options.o fuse_trash.o
fuse_stat_struct.o fuse_users.o fuse_init.o fuse_connect.o
fuse_impls_access.o fuse_impls_chmod.o fuse_impls_chown.o
fuse_impls_create.o fuse_impls_flush.o fuse_impls_getattr.o
fuse_impls_mkdir.o fuse_impls_mknod.o fuse_impls_open.o
fuse_impls_read.o fuse_impls_release.o fuse_impls_readdir.o
fuse_impls_rename.o fuse_impls_rmdir.o fuse_impls_statfs.o
fuse_impls_symlink.o fuse_impls_truncate.o fuse_impls_utimens.o
[exec] /usr/bin/ld: cannot find -lhdfs
I think it's because the .so is not properly generated at the previous step.
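For what it's worth, a rough sketch of how one might check that previous step (the ant target is the one the MountableHDFS wiki describes; the output path is an assumption based on the Hadoop 1.x build layout, so adjust it to your tree):

```shell
# Build libhdfs first; fuse_dfs links against it with -lhdfs
# (target/property names as given by the MountableHDFS wiki)
ant compile-c++-libhdfs -Dlibhdfs=1

# Verify the shared object was actually produced
find build -name 'libhdfs.so*'
# On a 64-bit build it typically lands somewhere like:
#   build/c++/Linux-amd64-64/lib/libhdfs.so

# The failing link line passed -L/lib, which likely does not
# contain libhdfs.so; point the build at the real location
# instead (illustrative path, adjust to what find reported):
export LD_LIBRARY_PATH=$PWD/build/c++/Linux-amd64-64/lib:$LD_LIBRARY_PATH
```

If `find` turns up no libhdfs.so at all, the link error is expected and the libhdfs build itself is what needs fixing first.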
I will give Bigtop a try and see if I have more luck.
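In case it helps the next reader, a hedged sketch of the Bigtop route Harsh describes below (the repository URL, branch name, and make targets are assumptions; check the Bigtop site for the current layout):

```shell
# Check out the Bigtop 0.3.x branch, which carries the
# Hadoop 1.x-specific build scripts Harsh mentions
git clone https://git-wip-us.apache.org/repos/asf/bigtop.git
cd bigtop
git checkout branch-0.3

# Build Hadoop as a native package, fuse extension included
make hadoop-deb    # on Debian/Ubuntu
# make hadoop-rpm  # on RHEL/CentOS
```

The advantage over the by-hand ant route is that Bigtop resolves the build dependencies itself and produces installable packages rather than loose binaries.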
2013/2/8, Harsh J <[EMAIL PROTECTED]>:
> I recall having mentioned it before, but why not use the Apache Bigtop
> RPM/DEB build scripts to generate all these binaries in good package
> form for yourself? It also builds up the fuse extensions and has
> clearer dependencies/build-process. Check out their 0.3.x branches for
> 1.x specific build scripts via http://bigtop.apache.org
> Otherwise, the path you're already following (i.e. ant native builds)
> is the correct one and you should be able to get a successful result
> if you have all the necessary dependencies.
> On Fri, Feb 8, 2013 at 10:00 PM, Jean-Marc Spaggiari
> <[EMAIL PROTECTED]> wrote:
>> I'm wondering what's the best way to install FUSE with Hadoop 1.0.3?
>> I'm trying to follow all the steps described here:
>> http://wiki.apache.org/hadoop/MountableHDFS but it's failing at each
>> step, taking hours to fix one before moving on to the next.
>> So I think I'm following the wrong path. There should be an easier
>> way to install this.
>> I tried the fuse-j-hadoopfs option too, but the JAR is no longer
>> available when we try to build.
>> Has anyone already set up FUSE with Hadoop 1.0.x?
> Harsh J