Hadoop >> mail # user >> hadoop-fuse unable to find java


John Bond 2011-09-05, 14:08
John Bond 2011-11-29, 19:11
Re: hadoop-fuse unable to find java
Hi,

This specific issue is probably more appropriate on the CDH-USER list.
(BCC common-user) It looks like the JRE detection mechanism recently
added to BIGTOP would have this same issue:
https://issues.apache.org/jira/browse/BIGTOP-25

To resolve the immediate issue I would set an environment variable in
/etc/default/hadoop-0.20 or hadoop-env.sh. You could set it statically to a
particular version, or perhaps use:
export JAVA_HOME=$(readlink -f /usr/java/latest)
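A rough sketch of what that resolution does, using a throwaway directory in place of the real /usr/java tree (all paths below are made up for illustration):

```shell
# Hypothetical stand-in for the /usr/java/latest -> jdk symlink
tmp=$(mktemp -d)
mkdir -p "$tmp/jdk1.6.0_30"
ln -s "$tmp/jdk1.6.0_30" "$tmp/latest"
# readlink -f resolves the symlink chain to the real directory, so the
# exported JAVA_HOME stays a concrete path even if 'latest' is repointed
JAVA_HOME=$(readlink -f "$tmp/latest")
echo "$JAVA_HOME"
```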

Ultimately I think this will be fixed in BigTop but also may need to
be fixed in CDH3. As such I have filed a JIRA for you:

https://issues.cloudera.org/browse/DISTRO-349

If you are interested in seeing how the issue progresses you can
"Watch" the issue and receive email updates.

Cheers,
Brock

On Tue, Nov 29, 2011 at 1:11 PM, John Bond <[EMAIL PROTECTED]> wrote:
> Still getting this using
>
> Hadoop 0.20.2-cdh3u2
>
>
>
> On 5 September 2011 16:08, John Bond <[EMAIL PROTECTED]> wrote:
>> I have recently rebuilt a server with CentOS 6.0 and it seems that
>> something caused hadoop-fuse to get confused and it is no longer able
>> to find libjvm.so.  The error I get is:
>>
>> find: `/usr/lib/jvm/java-1.6.0-sun-1.6.0.14/jre//jre/lib': No such
>> file or directory
>> /usr/lib/hadoop-0.20/bin/fuse_dfs: error while loading shared
>> libraries: libjvm.so: cannot open shared object file: No such file or
>> directory
>>
>> A quick and dirty look around suggests /usr/lib/hadoop-0.20/bin/hadoop-config.sh
>> is setting JAVA_HOME to `/usr/lib/jvm/java-1.6.0-sun-1.6.0.14/jre/`
>>
>> /usr/bin/hadoop-fuse-dfs has the following which adds an extra /jre/
>> to the path
>>
>>  for f in `find ${JAVA_HOME}/jre/lib -name client -prune -o -name
>> libjvm.so -exec dirname {} \;`; do
>>
>> Is there a need to specify the subfolder?  I think it would make
>> things simpler to just change the above to:
>>
>>  for f in `find ${JAVA_HOME} -name client -prune -o -name libjvm.so
>> -exec dirname {} \;`; do
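For reference, a self-contained sketch of what that find invocation does: `-name client -prune` skips client/ VM directories, while `-exec dirname` prints the directory containing libjvm.so, which is what ends up on LD_LIBRARY_PATH (the tree below is fabricated for illustration):

```shell
# Fabricated JAVA_HOME layout standing in for a real JDK install
jh=$(mktemp -d)
mkdir -p "$jh/jre/lib/amd64/server" "$jh/jre/lib/amd64/client"
touch "$jh/jre/lib/amd64/server/libjvm.so"
# -name client -prune skips the client/ subtree; any libjvm.so found
# elsewhere has its containing directory printed by dirname
for f in `find "$jh" -name client -prune -o -name libjvm.so -exec dirname {} \;`; do
  echo "found libjvm.so in: $f"
done
```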
>>
>>
>> The other option is to change
>> /usr/lib/hadoop-0.20/bin/hadoop-config.sh so it sets the path without
>> jre: either remove `/usr/lib/jvm/java-1.6.0-sun-1.6.0.*/jre/ \` from the
>> candidate list, or reorder the search list so
>> /usr/lib/jvm/java-1.6.0-sun-1.6.0.*/ \ is preferred.
>>
>> regards
>> John
>>
>> hadoop-fuse-dfs
>> @@ -14,7 +14,7 @@
>>
>>  if [ "${LD_LIBRARY_PATH}" = "" ]; then
>>   export LD_LIBRARY_PATH=/usr/lib
>> -  for f in `find ${JAVA_HOME} -name client -prune -o -name libjvm.so
>> -exec dirname {} \;`; do
>> +  for f in `find ${JAVA_HOME}/jre/lib -name client -prune -o -name
>> libjvm.so -exec dirname {} \;`; do
>>     export LD_LIBRARY_PATH=$f:${LD_LIBRARY_PATH}
>>   done
>>  fi
>>
>> hadoop-config.sh
>> @@ -68,8 +68,8 @@
>>  if [ -z "$JAVA_HOME" ]; then
>>   for candidate in \
>>     /usr/lib/jvm/java-6-sun \
>> -    /usr/lib/jvm/java-1.6.0-sun-1.6.0.* \
>>     /usr/lib/jvm/java-1.6.0-sun-1.6.0.*/jre/ \
>> +    /usr/lib/jvm/java-1.6.0-sun-1.6.0.* \
>>     /usr/lib/j2sdk1.6-sun \
>>     /usr/java/jdk1.6* \
>>     /usr/java/jre1.6* \
>
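The reordering matters because hadoop-config.sh takes the first candidate directory that exists. A minimal sketch of that scan under assumed paths (pick_java_home is a made-up helper mirroring the loop, not the actual script):

```shell
# Made-up helper mirroring the candidate loop in hadoop-config.sh:
# the first directory that exists wins
pick_java_home() {
  for candidate in "$@"; do
    if [ -d "$candidate" ]; then
      echo "$candidate"
      return 0
    fi
  done
  return 1
}

base=$(mktemp -d)
mkdir -p "$base/java-1.6.0-sun-1.6.0.14/jre"
# With the jre/ glob listed first, the jre/ directory is selected;
# swapping the order would select the parent JDK directory instead
pick_java_home "$base/java-1.6.0-sun-1.6.0.14/jre" "$base/java-1.6.0-sun-1.6.0.14"
```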
Harsh J 2011-11-30, 04:47