Hadoop >> mail # user >> hadoop v0.23.9, namenode -format command results in Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode


Re: hadoop v0.23.9, namenode -format command results in Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode
thanks, i also tried using HADOOP_PREFIX, but that didn't work. i still get
the same error: Could not find or load main class
org.apache.hadoop.hdfs.server.namenode.NameNode

btw, how do we install hadoop-common and hadoop-hdfs?

also, according to this link,
http://hadoop.apache.org/docs/r0.23.9/hadoop-project-dist/hadoop-common/SingleCluster.html,
there are several other variables to set.

$HADOOP_COMMON_HOME
$HADOOP_HDFS_HOME
$HADOOP_MAPRED_HOME
$YARN_HOME

where do we set these variables? i tried to set them as follows, but that
still did not get rid of the error message.

export HADOOP_COMMON_HOME=${HADOOP_PREFIX}/share/hadoop/common
export HADOOP_HDFS_HOME=${HADOOP_PREFIX}/share/hadoop/hdfs
export HADOOP_MAPRED_HOME=${HADOOP_PREFIX}/share/hadoop/mapreduce
export YARN_HOME=${HADOOP_PREFIX}/share/hadoop/yarn
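for anyone following along, these four variables are plain shell exports, so
they would normally go in ~/.bashrc or etc/hadoop/hadoop-env.sh. a minimal
sketch, assuming the install prefix is /opt/hadoop (an assumption; substitute
your own path):

```shell
# Sketch only: export the four variables the SingleCluster doc lists.
# Note there is no leading "$" on the left-hand side of an assignment.
export HADOOP_PREFIX=/opt/hadoop
export HADOOP_COMMON_HOME=${HADOOP_PREFIX}/share/hadoop/common
export HADOOP_HDFS_HOME=${HADOOP_PREFIX}/share/hadoop/hdfs
export HADOOP_MAPRED_HOME=${HADOOP_PREFIX}/share/hadoop/mapreduce
export YARN_HOME=${HADOOP_PREFIX}/share/hadoop/yarn
```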

interestingly, that link never talks about HADOOP_PREFIX.

On Sun, Aug 11, 2013 at 3:21 AM, Harsh J <[EMAIL PROTECTED]> wrote:

> I don't think you ought to be using HADOOP_HOME anymore.
>
> Try "unset HADOOP_HOME" and then "export HADOOP_PREFIX=/opt/hadoop"
> and retry the NN command.
>
> On Sun, Aug 11, 2013 at 8:50 AM, Jane Wayne <[EMAIL PROTECTED]>
> wrote:
> > hi,
> >
> > i have downloaded and untarred hadoop v0.23.9. i am trying to set up a
> > single-node instance to learn this version of hadoop. also, i am
> > following, as best i can, the instructions at
> >
> http://hadoop.apache.org/docs/r0.23.9/hadoop-project-dist/hadoop-common/SingleCluster.html
> > .
> >
> > when i attempt to run ${HADOOP_HOME}/bin/hdfs namenode -format, i get the
> > following error.
> >
> > Error: Could not find or load main class
> > org.apache.hadoop.hdfs.server.namenode.NameNode
> >
> > the instructions in the link above are incomplete. they jump right in
> > and say, "assuming you have installed hadoop-common/hadoop-hdfs..."
> > what does this assumption even mean? how do we install hadoop-common and
> hadoop-hdfs?
> >
> > right now, i am running on CentOS 6.4 x64 minimal. my steps are the
> > following.
> >
> > 0. installed jdk 1.7 (Oracle)
> > 1. tar xfz hadoop-0.23.9.tar.gz
> > 2. mv hadoop-0.23.9 /opt
> > 3. ln -s /opt/hadoop-0.23.9 /opt/hadoop
> > 4. export HADOOP_HOME=/opt/hadoop
> > 5. export JAVA_HOME=/opt/java
> > 6. export PATH=${JAVA_HOME}/bin:${HADOOP_HOME}/bin:${PATH}
> >
> > any help is appreciated.
>
>
>
> --
> Harsh J
>