Re: org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException
Your hdfs-site.xml on the NN defines an "includes" file (dfs.hosts), but
that file does not list this connecting DN's proper hostname/IP, so the
NN rejects the DN when it tries to register itself at startup.
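
For reference, a minimal sketch of the NN-side setup this implies (the
/etc/hadoop/conf path and the hostnames below are hypothetical, not taken
from your cluster):

    <!-- hdfs-site.xml on the NN: dfs.hosts points at the includes file -->
    <property>
      <name>dfs.hosts</name>
      <value>/etc/hadoop/conf/dfs.include</value>
    </property>

    # /etc/hadoop/conf/dfs.include: one permitted DN per line, matching
    # the hostname or IP the DN actually registers with
    datanode1.example.com
    172.16.1.10

After adding the DN to the includes file, run "hdfs dfsadmin -refreshNodes"
on the NN so it re-reads the list without a restart.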

On Thu, Jan 9, 2014 at 4:05 PM, Pedro Sa da Costa <[EMAIL PROTECTED]> wrote:
>
> When I try to launch the namenode and the datanode in MRv2, the datanode
> can't connect to the namenode, and I get the error below. I have also put
> the core-site.xml file that I use below.
>
> The firewall on the hosts is disabled, and I have no excluded nodes defined.
> Why can't the datanodes connect to the namenode? Any help solving this
> problem?
>
>
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
> Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-1449645935-172.16.1.10-50010-1389224474955, infoPort=50075, ipcPort=50020, storageInfo=lv=-40;cid=CID-9a8571a3-17ae-49b2-b957-b009e88b9f9a;nsid=934416283;c=0)
>         at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:631)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3398)
>         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:881)
>         at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
>         at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:18295)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:454)
>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1014)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1741)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1737)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:416)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1478)
>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1735)
>
>         at org.apache.hadoop.ipc.Client.call(Client.java:1235)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
>         at com.sun.proxy.$Proxy9.registerDatanode(Unknown Source)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:622)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
>         at com.sun.proxy.$Proxy9.registerDatanode(Unknown Source)
>         at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
>         at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
>         at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
>         at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
>         at java.lang.Thread.run(Thread.java:701)
>
> I set the core-site.xml as follows:
>
> <configuration>
>   <property>
>     <name>fs.default.name</name>
>     <value>hdfs://10.103.0.17:9000</value>
>   </property>
>   <property>
>     <name>hadoop.tmp.dir</name>
>     <value>/tmp/hadoop-temp</value>
>   </property>
>   <property>
>     <name>hadoop.proxyuser.root.hosts</name>
>     <value>*</value>
>   </property>

Harsh J