Re: stuck in safe mode after restarting dfs after found dead node
Edward Capriolo 2012-07-13, 15:10
If the datanode is not coming back, you have to explicitly tell Hadoop
to leave safe mode:

hadoop dfsadmin -safemode leave
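For reference, a sketch of the related dfsadmin/fsck invocations from the Hadoop 1.x CLI (run these as the HDFS superuser; your cluster layout may vary):

```shell
# Check whether the namenode is currently in safe mode
hadoop dfsadmin -safemode get

# Force the namenode out of safe mode
hadoop dfsadmin -safemode leave

# Then check which blocks are missing or under-replicated
hadoop fsck / -blocks -locations
```

Note that leaving safe mode manually does not recover the blocks that
were on the dead node; fsck will still report them as missing until
they are re-replicated or the node returns.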
On Fri, Jul 13, 2012 at 9:35 AM, Juan Pino <[EMAIL PROTECTED]> wrote:
> I can't get HDFS to leave safe mode automatically. Here is what I did:
> -- there was a dead node
> -- I stopped dfs
> -- I restarted dfs
> -- Safe mode wouldn't leave automatically
> I am using hadoop-1.0.2
> Here are the logs:
> end of hadoop-hadoop-namenode.log (attached):
> 2012-07-13 13:22:29,372 INFO org.apache.hadoop.hdfs.StateChange: STATE* Safe
> mode ON.
> The ratio of reported blocks 0.9795 has not reached the threshold 0.9990.
> Safe mode will be turned off automatically.
> 2012-07-13 13:22:29,375 INFO org.apache.hadoop.hdfs.StateChange: STATE* Safe
> mode extension entered.
> The ratio of reported blocks 0.9990 has reached the threshold 0.9990. Safe
> mode will be turned off automatically in 29 seconds.
> 2012-07-13 13:22:29,375 INFO org.apache.hadoop.hdfs.StateChange: *BLOCK*
> NameSystem.processReport: from , blocks: 3128, processing time: 4 msecs
> 2012-07-13 13:31:29,201 INFO org.apache.hadoop.hdfs.StateChange: BLOCK*
> NameSystem.processReport: discarded non-initial block report from because
> namenode still in startup phase
> Any help would be greatly appreciated.
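The quoted log lines show why safe mode persisted: the namenode only
leaves safe mode once the fraction of reported blocks reaches
dfs.safemode.threshold.pct (0.999 by default). A minimal illustration of
that comparison, using the ratios from the log (this is just arithmetic,
not the actual HDFS code):

```shell
# 0.9795 < 0.9990, so the namenode stays in safe mode;
# at 0.9990 it starts the safe-mode extension countdown instead.
awk 'BEGIN { if (0.9795 >= 0.9990) print "leave safe mode"; else print "stay in safe mode" }'
```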