Re: Hadoop startup problem - directory name required
On Mon, Aug 23, 2010 at 12:09 PM, cliff palmer <[EMAIL PROTECTED]> wrote:
> The 3 *-site.xml files are in the /etc/hadoop-0.20/conf directory.  I've
> confirmed that these are the files being used.
> Thanks again.
> Cliff
>
> On Mon, Aug 23, 2010 at 10:26 AM, Harsh J <[EMAIL PROTECTED]> wrote:
>
>> Can you confirm that this is the right configuration your NN is starting
>> with?
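>>
>> A rough way to check (a sketch, assuming a Linux box, the stock 0.20
>> startup scripts and a running daemon; adjust to your setup):
>>
>>   # Which conf dir did the shell export before launching the daemon?
>>   echo $HADOOP_CONF_DIR
>>   # Inspect the running NameNode's command line for conf/classpath hints.
>>   ps -ef | grep [N]ameNode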
>>
>> On Mon, Aug 23, 2010 at 7:19 PM, cliff palmer <[EMAIL PROTECTED]>
>> wrote:
>> > Thanks Harsh, but I am still not sure I understand what is going on.
>> > The directory specified in the dfs.name.dir property,
>> > "/var/lib/hadoop-0.20/dfsname", does exist and rights to that directory
>> > have been granted to the OS user that is running the Hadoop startup script.
>> > The directory mentioned in the error message is
>> > "/var/lib/hadoop-0.20/cache/hadoop/dfs/name".
>> > I can create this directory and that would (I assume) remove the error,
>> > but I want to understand how the name is derived.  It's not a child of
>> > the directory name specified in the dfs.name.dir property.
>> > Thanks again!
>> > Cliff
>> >
>> >
>> > On Mon, Aug 23, 2010 at 9:21 AM, Harsh J <[EMAIL PROTECTED]> wrote:
>> >
>> >> It's checking this directory because your dfs.name.dir (hdfs-site.xml)
>> >> has it in its list of dirs to write a copy to:
>> >>
>> >> <property>
>> >>               <name>dfs.name.dir</name>
>> >>               <value>/DFS/dfsname,/var/lib/hadoop-0.20/dfsname</value>
>> >> </property>
>> >>
>> >> Remove it from this property if you don't need it. If you need
>> >> multiple dirs, you should create the path and grant it the proper
>> >> permissions so that the NameNode can write to it. It's good to use a
>> >> second path for backup purposes (most keep this one on NFS). The path
>> >> looks the way it does because your distribution of Hadoop probably
>> >> placed its files around that area and set it up in the conf files as
>> >> a default :)
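>> >>
>> >> For reference, the 0.20 default for this property lives in
>> >> hdfs-default.xml and is derived from hadoop.tmp.dir (quoting the shape
>> >> from memory, so verify against the copy shipped with your install):
>> >>
>> >> <property>
>> >>               <name>dfs.name.dir</name>
>> >>               <value>${hadoop.tmp.dir}/dfs/name</value>
>> >> </property>
>> >>
>> >> With hadoop.tmp.dir set to /var/lib/hadoop-0.20/cache/hadoop in your
>> >> core-site.xml, that default expands to exactly the
>> >> /var/lib/hadoop-0.20/cache/hadoop/dfs/name path from the error.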
>> >>
>> >> On Mon, Aug 23, 2010 at 6:45 PM, cliff palmer <[EMAIL PROTECTED]>
>> >> wrote:
>> >> > The namenode log for a Hadoop-0.20 installation contains this error
>> >> > message: "/var/lib/hadoop-0.20/cache/hadoop/dfs/name is in an
>> >> > inconsistent state".
>> >> > This directory does not exist and I would like to understand why this
>> >> > particular directory name is required (not what the directory is used
>> >> > for, but why this particular directory name).  The *-site.xml files
>> >> > are below (IP addresses have been masked).
>> >> > Thanks in advance for your help.
>> >> > Cliff
>> >> >
>> >> > core-site.xml:
>> >> >
>> >> > <?xml version="1.0"?>
>> >> > <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>> >> >
>> >> > <!-- Put site-specific property overrides in this file. -->
>> >> >
>> >> > <configuration>
>> >> >        <property>
>> >> >                <name>fs.default.name</name>
>> >> >                <value>hdfs://xxx.xxx.xxx.xxx:8020</value>
>> >> >                <final>true</final>
>> >> >        </property>
>> >> >        <property>
>> >> >                <name>hadoop.tmp.dir</name>
>> >> >                <value>/var/lib/hadoop-0.20/cache/hadoop</value>
>> >> >        </property>
>> >> > </configuration>
>> >> > -------- end of core-site.xml -----------
>> >> >
>> >> > hdfs-site.xml:
>> >> >
>> >> > <?xml version="1.0"?>
>> >> > <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>> >> >
>> >> > <!-- Put site-specific property overrides in this file. -->
>> >> >
>> >> > <configuration>
>> >> >        <property>
>> >> >                <name>dfs.replication</name>
>> >> >                <value>3</value>
>> >> >        </property>
>> >> >        <property>
>> >> >                <name>dfs.name.dir</name>
>> >> >                <value>/DFS/dfsname,/var/lib/hadoop-0.20/dfsname</value>
>> >> >        </property>
>> >> >        <property>
>> >> >                <name>dfs.data.dir</name>
>> >> >                <value>/DFS1/dfsdata,/DFS2/dfsdata,/DFS3/dfsdata</value>
>> >> >        </property>
>> >> > </configuration>
>> >> > -------- end of hdfs-site.xml -----------

Check the permissions of the paths above this one as well. Make sure every
parent directory has execute (search) permission for the user you run
Hadoop as.
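
A sketch of one way to audit and fix that (assumes util-linux's namei is
available and that "hadoop" is the account running the NameNode; substitute
your own user, group and paths):

  # Show the owner and mode of every component along the path.
  namei -l /var/lib/hadoop-0.20/dfsname
  # Create the second name dir (if you keep it) and hand it to the user.
  sudo mkdir -p /DFS/dfsname
  sudo chown -R hadoop:hadoop /DFS/dfsname
  sudo chmod 700 /DFS/dfsname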