MapReduce, mail # user - Re: New hadoop 1.2 single node installation giving problems


Re: New hadoop 1.2 single node installation giving problems
Shekhar Sharma 2013-07-23, 17:07
It's a warning, not an error...

Create a directory and then do ls. (In your case /user/hduser is not
created until the first time you create a directory there or put
some file into it.)

hadoop fs  -mkdir sample

hadoop fs  -ls

If you are getting a permission problem,
I would suggest checking the following:

(1) Did you run the command "hadoop namenode -format" as one user,
and are you now accessing HDFS as a different user?
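If the namenode was formatted as one user and commands are run as another, the
HDFS home directory may end up owned by the wrong user. A quick way to check
(the hduser user/group names below are illustrative, not from the thread):

```shell
# Which user am I running commands as?
whoami

# Who owns the HDFS home directories?
hadoop fs -ls /user

# If the owner differs, fix ownership as the HDFS superuser
# (typically the user who started the namenode):
hadoop fs -chown -R hduser:hduser /user/hduser
```

These commands require a running Hadoop 1.x cluster, so treat them as a sketch
of the check rather than something to paste blindly.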

On Tue, Jul 23, 2013 at 10:10 PM, <[EMAIL PROTECTED]> wrote:

>
> Hi Ashish
>
> In your hdfs-site.xml, within the <configuration> tag you need to have a
> <property> tag, and inside each <property> tag you can have <name>, <value>
> and <description> tags.
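> For example, a well-formed hdfs-site.xml following this structure would look
> like this (the replication value shown is just illustrative):
>
> ```xml
> <configuration>
>   <property>
>     <name>dfs.replication</name>
>     <value>1</value>
>     <description>Default block replication.</description>
>   </property>
> </configuration>
> ```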
>
> Regards
> Bejoy KS
>
> Sent from remote device, Please excuse typos
> ------------------------------
> From: Ashish Umrani <[EMAIL PROTECTED]>
> Date: Tue, 23 Jul 2013 09:28:00 -0700
> To: <[EMAIL PROTECTED]>
> Reply-To: [EMAIL PROTECTED]
> Subject: Re: New hadoop 1.2 single node installation giving problems
>
> Hey, thanks for the response.  I have changed 4 files during installation:
>
> core-site.xml
> mapred-site.xml
> hdfs-site.xml   and
> hadoop-env.sh
>
>
> I could not find any issues except that all params in hadoop-env.sh
> are commented out.  Only JAVA_HOME is uncommented.
>
> If you have a quick minute, can you please browse through these files in
> this email and let me know where the issue could be.
>
> Regards
> ashish
>
>
>
> I am listing those files below.
> core-site.xml
> <?xml version="1.0"?>
> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>
> <!-- Put site-specific property overrides in this file. -->
>
> <configuration>
>   <property>
>     <name>hadoop.tmp.dir</name>
>     <value>/app/hadoop/tmp</value>
>     <description>A base for other temporary directories.</description>
>   </property>
>
>   <property>
>     <name>fs.default.name</name>
>     <value>hdfs://localhost:54310</value>
>     <description>The name of the default file system.  A URI whose
>     scheme and authority determine the FileSystem implementation.  The
>     uri's scheme determines the config property (fs.SCHEME.impl) naming
>     the FileSystem implementation class.  The uri's authority is used to
>     determine the host, port, etc. for a filesystem.</description>
>   </property>
> </configuration>
>
>
>
> mapred-site.xml
> <?xml version="1.0"?>
> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>
> <!-- Put site-specific property overrides in this file. -->
>
> <configuration>
>   <property>
>     <name>mapred.job.tracker</name>
>     <value>localhost:54311</value>
>     <description>The host and port that the MapReduce job tracker runs
>     at.  If "local", then jobs are run in-process as a single map
>     and reduce task.
>     </description>
>   </property>
> </configuration>
>
>
>
> hdfs-site.xml
> <?xml version="1.0"?>
> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>
> <!-- Put site-specific property overrides in this file. -->
>
> <configuration>
>   <name>dfs.replication</name>
>   <value>1</value>
>   <description>Default block replication.
>     The actual number of replications can be specified when the file is
> created.
>     The default is used if replication is not specified in create time.
>   </description>
> </configuration>
>
>
>
> hadoop-env.sh
> # Set Hadoop-specific environment variables here.
>
> # The only required environment variable is JAVA_HOME.  All others are
> # optional.  When running a distributed configuration it is best to
> # set JAVA_HOME in this file, so that it is correctly defined on
> # remote nodes.
>
> # The java implementation to use.  Required.
> export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_25
>
> # Extra Java CLASSPATH elements.  Optional.
> # export HADOOP_CLASSPATH=
>
> All other params in hadoop-env.sh are commented out.
>
>
> On Tue, Jul 23, 2013 at 8:38 AM, Jitendra Yadav <
> [EMAIL PROTECTED]> wrote:
>
>> Hi,
>>