HDFS >> mail # user >> problems with hadoop-0.20.205.0
Re: problems with hadoop-0.20.205.0
Hi Harsh,

     Here is the datanode log file:

2011-11-28 12:33:53,584 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = ubuntu/127.0.1.1
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 0.20.205.0
STARTUP_MSG:   build https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-security-205
-r 1179940; compiled by 'hortonfo' on Fri Oct  7 06:20:32 UTC 2011
************************************************************/
2011-11-28 12:33:53,685 INFO
org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from
hadoop-metrics2.properties
2011-11-28 12:33:53,693 INFO
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
MetricsSystem,sub=Stats registered.
2011-11-28 12:33:53,693 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
period at 10 second(s).
2011-11-28 12:33:53,693 INFO
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics
system started
2011-11-28 12:33:53,792 INFO
org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
ugi registered.
2011-11-28 12:33:53,795 WARN
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi
already exists!
2011-11-28 12:33:59,659 INFO
org.apache.hadoop.hdfs.server.common.Storage: Storage directory
/home/solr/hadoop-0.20.205.0/file:/home/solr/hdfs/data does not exist.
2011-11-28 12:33:59,660 INFO
org.apache.hadoop.hdfs.server.common.Storage: Storage directory
file:/home/solr/hdfs/data does not exist.
2011-11-28 12:33:59,821 ERROR
org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException:
All specified directories are not accessible or do not exist.
at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:139)
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:367)
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:281)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1545)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1484)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1502)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1628)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1645)

2011-11-28 12:33:59,822 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at ubuntu/127.0.1.1
************************************************************/

The namenode is working fine now, but the datanode problem still persists.
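The doubled path in the log ("/home/solr/hadoop-0.20.205.0/file:/home/solr/hdfs/data") suggests that dfs.data.dir was set to a "file:" URI. In 0.20.x this property expects plain local filesystem paths; a "file:" prefix is not parsed as a URI scheme and ends up treated as a relative path, so the Hadoop install directory gets prepended to it. A possible fix in hdfs-site.xml, assuming the intended location is /home/solr/hdfs/data as the log indicates:

```xml
<!-- hdfs-site.xml: sketch of a fix, assuming /home/solr/hdfs/data is the
     intended data directory. Note the plain path: in 0.20.x, dfs.data.dir
     takes local paths, not file: URIs, and a "file:" prefix produces the
     doubled path seen in the error above. -->
<property>
  <name>dfs.data.dir</name>
  <value>/home/solr/hdfs/data</value>
</property>
```

The directory must also actually exist and be writable by the user running the datanode (e.g. `mkdir -p /home/solr/hdfs/data`) before restarting, since the DataNode refuses to start when none of the configured directories are accessible.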

Regards,
    Mohammad Tariq

On Sun, Nov 27, 2011 at 7:53 PM, Harsh J <[EMAIL PROTECTED]> wrote:
> Perhaps you need to carry out an upgrade of DFS before you start it
> normally. Could you share what error you're seeing when you start your
> namenode (from log files, on cli, etc.)?
>
> On Sun, Nov 27, 2011 at 6:40 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>> I am able to work properly when using 0.20.203.0, but I am facing
>> problems with 0.20.205.0. The namenode and datanode are not getting
>> started. Is there any change in the command syntax or in the configuration?
>>
>> Regards,
>>     Mohammad Tariq
>>
>
>
>
> --
> Harsh J
>