RE: hadoop dfs -ls
Hi Nitin,

Normally your conf should reside in /etc/hadoop/conf (if you don't have one, copy it from the namenode and keep it in sync).
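
For example, something like rsync keeps the copies in sync (the "namenode" hostname below is just a placeholder for your environment):

[hdfs]$ rsync -av namenode:/etc/hadoop/conf/ /etc/hadoop/conf/   # pull the conf dir from the namenode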

The hadoop script by default depends on hadoop-setup.sh, which in turn depends on hadoop-env.sh in /etc/hadoop/conf.
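
A minimal /etc/hadoop/conf/hadoop-env.sh usually only needs a couple of entries (values below are only illustrative, adjust for your machines):

# /etc/hadoop/conf/hadoop-env.sh
export JAVA_HOME=/usr/java/default   # point at the JDK used by the cluster
export HADOOP_HEAPSIZE=1000          # daemon heap size in MB (optional)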

Or specify the config dir at runtime, e.g.:

[hdfs]$  hadoop [--config <path to your config dir>] <commands>
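
For example, to list the HDFS root with an explicit config dir (path is just an example):

[hdfs]$ hadoop --config /etc/hadoop/conf fs -ls /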

P.S. Some useful links:

http://wiki.apache.org/hadoop/FAQ

http://wiki.apache.org/hadoop/FrontPage

http://wiki.apache.org/hadoop/

http://hadoop.apache.org/common/docs/r1.0.3/

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of Dave Beech
Sent: Friday, July 13, 2012 6:18 AM
To: [EMAIL PROTECTED]
Subject: Re: hadoop dfs -ls

Hi Nitin

It's likely that your hadoop command isn't finding the right configuration.

In particular, it doesn't know where your namenode is (the fs.default.name setting in core-site.xml).
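
In Hadoop 1.x, fs.default.name defaults to file:///, so without the right core-site.xml the fs commands fall back to the local filesystem. A minimal core-site.xml would look roughly like this (hostname and port are placeholders for your namenode):

<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- point this at your namenode; host/port below are placeholders -->
    <value>hdfs://namenode:8020</value>
  </property>
</configuration>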

Maybe you need to set the HADOOP_CONF_DIR environment variable to point to your conf directory.
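
Something along these lines (path is just an example):

[hdfs]$ export HADOOP_CONF_DIR=/etc/hadoop/conf
[hdfs]$ hadoop fs -ls /     # should now list HDFS, not the local filesystem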

Dave

On 13 July 2012 14:11, Nitin Pawar <[EMAIL PROTECTED]> wrote:

> Hi,
>
> I have done this setup numerous times, but this time I did it after some break.
>
> I managed to get the cluster up and running fine, but when I do
> hadoop dfs -ls / it actually shows me the contents of the Linux file
> system.
>
> I am using hadoop-1.0.3 on rhel5.6.
>
> Can anyone suggest what I must have done wrong?
>
> --
> Nitin Pawar