MapReduce >> mail # user >> start-{dfs,mapred}.sh > Hadoop common not found


Re: start-{dfs,mapred}.sh > Hadoop common not found
Hi Martin,

This is a known bug, see https://issues.apache.org/jira/browse/HADOOP-6953.
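Until the fix lands, a commonly used workaround is to export the environment variables the scripts expect before invoking them. A minimal sketch, assuming Hadoop 0.21.0 is unpacked at /opt/hadoop-0.21.0 (the path is illustrative, not from this thread):

```shell
# Point both variables at the Hadoop install; start-{dfs,mapred}.sh use
# HADOOP_COMMON_HOME to locate bin/hadoop-config.sh.
export HADOOP_HOME=/opt/hadoop-0.21.0
export HADOOP_COMMON_HOME="$HADOOP_HOME"

# With the variables set, the non-deprecated scripts can find Hadoop common.
"$HADOOP_HOME/bin/start-dfs.sh"
"$HADOOP_HOME/bin/start-mapred.sh"
```

Putting the exports in hadoop-env.sh or your shell profile avoids repeating them each session.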

Cheers
Tom

On Wed, Sep 22, 2010 at 8:17 AM, Martin Becker <[EMAIL PROTECTED]> wrote:
>  Hi,
>
> I am using Hadoop MapReduce 0.21.0. The usual way of starting
> Hadoop/HDFS/MapReduce was the "start-all.sh" script. Now when I call that
> script, it tells me that its usage is deprecated and that I should use
> "start-{dfs,mapred}.sh" instead. But when I do so, the error message "Hadoop
> common not found" is thrown. Looking through the script files, it seems the
> problem is unset environment variables, namely HADOOP_HOME and
> HADOOP_COMMON_HOME. Those are set in hadoop-config.sh, yet
> start-{dfs,mapred}.sh in turn look for those two environment variables in
> order to locate exactly that script file: hadoop-config.sh. That seems odd
> to me. So is there a way of starting Hadoop in a non-deprecated way, or is
> this a bug?
>
> Martin
>