HDFS >> mail # dev >> [Hadoop]Environment variable CLASSPATH not set!

RE: [Hadoop]Environment variable CLASSPATH not set!
Environment variables hang off the session context and are specific to both the user profile and that user's shell preferences.  If your driver loads in kernel mode, it cannot depend on environment variables.

This will be a problem for the other environment variables too, such as HADOOP_HOME.

Instead of using Java directly in kernel mode, I suggest splitting the problem:
1. A filesystem abstraction for the kernel
   a. Like the NFS filesystem kernel driver, for example -- a remote-mount filesystem.
   b. Use a C implementation of the protocol.
      I. To avoid compatibility issues, use Hadoop 2.0, whose Protocol Buffers-based RPC gives you a versioned protocol, avoiding hangs and crashes when the protocol changes.
      II.  OR push most of your implementation into a proxy service:
          a. Surface NFS directly, and just use the NFS kernel driver.
          b. Surface your own protocol to be consumed by the kernel-mode driver.
2.  Start HDFS elsewhere, as an independent service in user mode, like cups, httpd, or xinetd.
    a.  That service will have a session and the ability to configure environment variables.
Not sure if that exactly answers the question, but I hope it was helpful.


Sent from my Windows Phone
From: harryxiyou<mailto:[EMAIL PROTECTED]>
Sent: 2/9/2013 5:35 AM
Subject: [Hadoop]Environment variable CLASSPATH not set!

Hi all,

We are developing an HDFS-based file system called HLFS
(http://code.google.com/p/cloudxy/wiki/WHAT_IS_CLOUDXY). We have
developed an HLFS driver for Libvirt (http://libvirt.org/). But when I
boot a VM from a base Linux OS that was first installed onto our HLFS
block device, it (HDFS or the JVM) says I have not set CLASSPATH,
like the following:

Environment variable CLASSPATH not set!
fs is null, hdfsConnect error!
Actually, I have set CLASSPATH in ~/.bashrc as shown below. I have
installed CDH3u2 for development, and I can run other HDFS jobs successfully.

$ cat /home/jiawei/.bashrc
export HLFS_HOME=/home/jiawei/workshop3/hlfs
export LOG_HOME=$HLFS_HOME/3part/log
export SNAPPY_HOME=$HLFS_HOME/3part/snappy
export HADOOP_HOME=$HLFS_HOME/3part/hadoop
export JAVA_HOME=/usr/lib/jvm/java-6-sun
export PATH=/usr/bin/:/usr/local/bin/:/bin/:/usr/sbin/:/sbin/:$JAVA_HOME/bin/
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/i386/server/:$HADOOP_HOME/lib32/:$LOG_HOME/lib32/:$SNAPPY_HOME/lib32/:$HLFS_HOME/output/lib32/:/usr/lib/
export PKG_CONFIG_PATH=/usr/lib/pkgconfig/:/usr/share/pkgconfig/
export CFLAGS="-L/usr/lib -L/lib -L/usr/lib64"
export CXXFLAGS="-L/usr/lib -L/lib"
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/htmlconverter.jar:$JAVA_HOME/lib/jconsole.jar:$JAVA_HOME/lib/jconsole.jar:$JAVA_HOME/lib/tools.jar:$JAVA_HOME/jre/lib/charsets.jar:$JAVA_HOME/jre/lib/deploy.jar:$JAVA_HOME/jre/lib/javaws.jar:$JAVA_HOME/jre/lib/jce.jar:$JAVA_HOME/jre/lib/jsse.jar:$JAVA_HOME/jre/lib/management-agent.jar:$JAVA_HOME/jre/lib/plugin.jar:$JAVA_HOME/jre/lib/resources.jar:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/jre/lib/:$JAVA_HOME/lib/:/usr/lib/hadoop-0.20/conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/usr/lib/hadoop-0.20:/usr/lib/hadoop-0.20/hadoop-core-0.20.2-cdh3u2.jar:/usr/lib/hadoop-0.20/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop-0.20/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop-0.20/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop-0.20/lib/commons-cli-1.2.jar:/usr/lib/hadoop-0.20/lib/commons-codec-1.4.jar:/usr/lib/hadoop-0.20/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop-0.20/lib/commons-el-1.0.jar:/usr/lib/hadoop-0.20/lib/commons-httpclient-3.1.jar:/usr/lib/hadoop-0.20/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop-0.20/lib/commons-net-1.4.1.jar:/usr/lib/hadoop-0.20/lib/core-3.1.1.jar:/usr/lib/hadoop-0.20/lib/hadoop-fairscheduler-0.20.2-cdh3u2.jar:/usr/lib/hadoop-0.20/lib/hsqldb-

I have reported this as an issue; see
http://code.google.com/p/cloudxy/issues/detail?id=37 for details.

Could anyone give me some suggestions?
Thanks a lot in advance ;-)

Harry Wei