Hadoop, mail # general - bin/hadoop namenode -format IOException: Invalid argument


Re: bin/hadoop namenode -format          IOException: Invalid argument
Konstantin Shvachko 2010-03-24, 18:13
This may be a problem with the underlying local file system.
Some file systems simply don't support locks; some NFS mounts, for example.
Others may have a buggy native Java implementation.
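A quick way to check which file system actually backs the directory in question (a sketch, assuming a Linux host; substitute your own dfs.name.dir path for /tmp):

  df -T /tmp          # prints the file system type backing /tmp (or whatever dfs.name.dir points at)
  mount | grep nfs    # lists any NFS mounts, a common source of broken file locking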

Are your name-node directories in /tmp, which is the default?
/tmp can behave strangely.
You should set "dfs.name.dir" in hdfs-site.xml to point to a directory on a local hard drive.
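For example, something along these lines in hdfs-site.xml (the path below is only an illustration; point it at whatever local directory suits your machine):

  <property>
    <name>dfs.name.dir</name>
    <!-- a directory on local disk, not /tmp and not an NFS mount -->
    <value>/var/lib/hadoop/dfs/name</value>
  </property>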

--Konstantin

On 3/24/2010 3:17 AM, Steve Loughran wrote:
> Gary Yang wrote:
>> No. The namenode is not running. "bin/hadoop namenode -format" was the
>> very first command. I have not had a chance to start the namenode yet.
>> Any idea?
>
>>>
>>> 10/03/23 11:54:56 ERROR namenode.NameNode: java.io.IOException: Invalid argument
>>>         at sun.nio.ch.FileChannelImpl.tryLock(FileChannelImpl.java:900)
>>>         at java.nio.channels.FileChannel.tryLock(FileChannel.java:974)
>>>         at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.tryLock(Storage.java:527)
>>>         at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.lock(Storage.java:505)
>>>         at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:1087)
>>>         at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:1110)
>>>         at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:856)
>>>         at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:948)
>>>         at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:965)
>
>
> That could be the file system being unhappy about one of the directories.
>
> Check all your namenode dir settings, make sure they are valid paths,
> and try to create them as the hadoop user.