No build.xml when to build FUSE


Re: No build.xml when to build FUSE
Hi Harsh,

   I have found the reason and the solution after checking the fuse-dfs
source code, so I am replying again to close this question.

   The error came up because the hadoop-*.jar files need to be on the
CLASSPATH. After I added them to the CLASSPATH, it works.
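
   For anyone hitting the same problem, roughly what I did before launching
the wrapper looks like the following (the /usr/local/hadoop install path and
the find-based jar collection are just how my machine is laid out, so adjust
them for your environment):

   # gather every hadoop-*.jar from the install into the CLASSPATH
   export CLASSPATH=$CLASSPATH:$(find /usr/local/hadoop -name "hadoop-*.jar" | tr '\n' ':')
   # then mount as before
   ./fuse_dfs_wrapper.sh hdfs://Hadoop:8020/ /home/hadoop/expdfs/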

  Thank you
Regards.
2013/4/12 YouPeng Yang <[EMAIL PROTECTED]>

> Hi Harsh
>
>    Sorry for the late reply.
>
>    I still get some errors.
>
>    After I ran the following command under the hadoop-hdfs project:
>    *mvn install -Drequire.fuse=true -DskipTests*
>    I still could not find the fuse_dfs binary on my system when running
> the command:
>    *find / -name fuse_dfs*.
>
>    Thanks to some googling, I tried *mvn package -Pnative -DskipTests*, and
> fuse_dfs showed up.
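>
>    For completeness, the sequence that finally produced the binary in my
> tree is roughly the following (run from the hadoop-hdfs module directory;
> where the binary ends up may differ in other versions):
>
>    cd /usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs
>    mvn package -Pnative -DskipTests
>    find . -name fuse_dfs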
>
>     So I ran the command as user hadoop and got an error:
>     [hadoop@Hadoop fuse-dfs]$ *./fuse_dfs_wrapper.sh hdfs://Hadoop:8020/
> /home/hadoop/expdfs/*
>     INFO
> /usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_options.c:164
> Adding FUSE arg /home/hadoop/expdfs/
>     fuse: failed to exec fusermount: Permission denied
>
>     Given that error, I switched to the root user and re-ran the command
> in debug mode:
>
>    [root@Hadoop fuse-dfs]# *./fuse_dfs_wrapper.sh -d  hdfs://
> 192.168.1.150:8080/ /home/hadoop/expdfs/*
> INFO
> /usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_options.c:115
> Ignoring option -d
> INFO
> /usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_options.c:164
> Adding FUSE arg /home/hadoop/expdfs/
> FUSE library version: 2.8.3
> nullpath_ok: 0
> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
> INIT: 7.13
> flags=0x0000007b
> max_readahead=0x00020000
> INFO
> /usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:98
> Mounting with options: [ protected=(NULL), nn_uri=hdfs://
> 192.168.1.150:8080/, nn_port=0, debug=0, read_only=0, initchecks=0,
> no_permissions=0, usetrash=0, entry_timeout=60, attribute_timeout=60,
> rdbuffer_size=10485760, direct_io=0 ]
> loadFileSystems error:
> (unable to get stack trace for java.lang.NoClassDefFoundError exception:
> ExceptionUtils::getStackTrace error.)
> hdfsConfGetInt(hadoop.fuse.timer.period): new Configuration error:
> (unable to get stack trace for java.lang.NoClassDefFoundError exception:
> ExceptionUtils::getStackTrace error.)
> Unable to determine the configured value for
> hadoop.fuse.timer.period.ERROR
> /usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:135
> FATAL: dfs_init: fuseConnectInit failed with error -22!
> ERROR
> /usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:34
> LD_LIBRARY_PATH=/home/oracle/database/product/10.2.0/db_1/lib:
> ERROR
> /usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:35
> CLASSPATH=.:/usr/local/java/lib/dt.jar:/usr/local/java/lib/tools.jar
>
>
>    I am stuck on this FATAL error.
>    Please give me some suggestions.
>
>   Any help will be appreciated.
>
>
> Regards
>
> 2013/4/11 Harsh J <[EMAIL PROTECTED]>
>
>> Hi,
>>
>> You need to place fuse_dfs' binary directory on your PATH if you
>> expect to use that script - it is simply looking it up as a command
>> and not finding it. I usually just invoke the fuse_dfs binary directly
>> since my environment is usually pre-setup.
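>>
>> For example, something along these lines (the path below is only a
>> placeholder for wherever your build actually dropped the fuse_dfs
>> binary):
>>
>>   export PATH=$PATH:/path/to/the/directory/containing/fuse_dfs
>>   ./fuse_dfs_wrapper.sh hdfs://Hadoop:8020 /mnt/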
>>
>> On Thu, Apr 11, 2013 at 8:19 PM, YouPeng Yang <[EMAIL PROTECTED]>
>> wrote:
>> > Hi Harsh:
>> >
>> >  I ran the following under the hadoop-hdfs project:
>> >  mvn install -Drequire.fuse=true -DskipTests
>> >
>> >  and the logs show: BUILD SUCCESS
>> >
>> >  So I went to
>> > src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/ to run
>> > fuse_dfs_wrapper.sh:
>> > [root@Hadoop fuse-dfs]# ./fuse_dfs_wrapper.sh  hdfs://Hadoop:8020 /mnt/
>> > ./fuse_dfs_wrapper.sh: line 46: fuse_dfs: command not found
>> >
>> > Obviously, the above error shows that something is still wrong.