Hadoop, mail # user - HTTP Error


Re: HTTP Error
Joey Echeverria 2011-07-08, 11:55
It looks like both datanodes are trying to serve data out of the same directory. Is there any chance that both datanodes are using the same NFS mount for dfs.data.dir?
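
One quick way to check, assuming a 0.20-era layout and a placeholder data directory of /path/to/dfs/data (substitute your actual dfs.data.dir), is to compare the storage ID and the backing mount on each datanode:

  # run on each datanode; /path/to/dfs/data is a stand-in for your dfs.data.dir
  cat /path/to/dfs/data/current/VERSION   # the storageID line should differ on each node
  df -h /path/to/dfs/data                 # shows whether both nodes sit on the same NFS export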

If not, what I would do is delete the data from ${dfs.data.dir} and then re-format the namenode. You'll lose all of your data; hopefully that's not a problem at this time.
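
A rough sketch of that reset, assuming the stock 0.20-era scripts and the same placeholder /path/to/dfs/data path (adjust for your setup):

  # on the namenode: stop HDFS
  bin/stop-dfs.sh

  # on each datanode: remove the old block storage (this destroys all HDFS data)
  rm -rf /path/to/dfs/data/*

  # on the namenode: re-format the filesystem metadata
  bin/hadoop namenode -format

  # restart HDFS
  bin/start-dfs.sh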

-Joey

On Jul 8, 2011, at 0:40, Adarsh Sharma <[EMAIL PROTECTED]> wrote:

> Thanks, I still don't understand the issue.
>
> My namenode has repeatedly shown these logs:
>
> 2011-07-08 09:36:31,365 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit: ugi=hadoop,hadoop    ip=/MAster-IP   cmd=listStatus    src=/home/hadoop/system    dst=null    perm=null
> 2011-07-08 09:36:31,367 INFO org.apache.hadoop.ipc.Server: IPC Server handler 2 on 9000, call delete(/home/hadoop/system, true) from Master-IP:53593: error: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot delete /home/hadoop/system. Name node is in safe mode.
> The ratio of reported blocks 0.8293 has not reached the threshold 0.9990. Safe mode will be turned off automatically.
> org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot delete /home/hadoop/system. Name node is in safe mode.
> The ratio of reported blocks 0.8293 has not reached the threshold 0.9990. Safe mode will be turned off automatically.
>   at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:1700)
>   at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:1680)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.delete(NameNode.java:517)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>   at java.lang.reflect.Method.invoke(Method.java:597)
>   at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:396)
>   at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
>
>
> And one of my datanodes shows the logs below:
>
> 2011-07-08 09:49:56,967 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeCommand action: DNA_REGISTER
> 2011-07-08 09:49:59,962 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: DataNode is shutting down: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.protocol.UnregisteredDatanodeException: Data node 192.168.0.209:50010 is attempting to report storage ID DS-218695497-SLave_IP-50010-1303978807280. Node SLave_IP:50010 is expected to serve this storage.
>       at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getDatanode(FSNamesystem.java:3920)
>       at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.processReport(FSNamesystem.java:2891)
>       at org.apache.hadoop.hdfs.server.namenode.NameNode.blockReport(NameNode.java:715)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>       at java.lang.reflect.Method.invoke(Method.java:597)
>       at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
>       at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
>       at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:396)
>       at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
>
>       at org.apache.hadoop.ipc.Client.call(Client.java:740)
>       at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
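
Regarding the safe mode message quoted above: the namenode stays read-only until enough block reports arrive, so the delete will keep failing until the datanode problem is sorted out. To check the state, or to force the namenode out of safe mode once the datanodes are healthy again, a sketch assuming the 0.20-era dfsadmin CLI:

  bin/hadoop dfsadmin -safemode get     # prints whether safe mode is ON or OFF
  bin/hadoop dfsadmin -safemode leave   # forces the namenode out of safe mode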