Home | About | Sematext search-lucene.com search-hadoop.com
HDFS >> mail # user >> The minimum memory requirements to datanode and namenode?


Re: The minimum memory requirements to datanode and namenode?
Can you share the specs of node3? In my experience, anything below 4
GB of RAM makes a node almost inaccessible, even on a test/demo cluster.
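The daemon heap sizes that drive that memory footprint are set in `conf/hadoop-env.sh`. A minimal sketch for a small test node, assuming the Hadoop 1.x layout these logs come from (the specific `-Xmx` values are illustrative, not a recommendation):

```shell
# Hedged sketch for conf/hadoop-env.sh (Hadoop 1.x): per-daemon heap
# sizes on a small (~4 GB) test node. 1000 MB is the stock default
# for every daemon; worker daemons can usually run smaller.
export HADOOP_HEAPSIZE=1000                                      # default heap for all daemons, in MB
export HADOOP_NAMENODE_OPTS="-Xmx1024m ${HADOOP_NAMENODE_OPTS}"  # NameNode heap grows with block count
export HADOOP_DATANODE_OPTS="-Xmx512m ${HADOOP_DATANODE_OPTS}"
export HADOOP_TASKTRACKER_OPTS="-Xmx512m ${HADOOP_TASKTRACKER_OPTS}"
```

Note that TaskTracker child tasks (`mapred.child.java.opts`) allocate heap on top of these daemon heaps, so they all have to fit in physical RAM together.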

On Sun, May 12, 2013 at 8:25 PM, sam liu <[EMAIL PROTECTED]> wrote:

> Got some exceptions on node3:
> 1. datanode log:
> 2013-04-17 11:13:44,719 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: writeBlock
> blk_2478755809192724446_1477 received exception
> java.net.SocketTimeoutException: 63000 millis timeout while waiting for
> channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/9.50.102.80:58371 remote=/9.50.102.79:50010]
> 2013-04-17 11:13:44,721 ERROR
> org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(
> 9.50.102.80:50010,
> storageID=DS-2038715921-9.50.102.80-50010-1366091297051, infoPort=50075,
> ipcPort=50020):DataXceiver
> java.net.SocketTimeoutException: 63000 millis timeout while waiting for
> channel to be ready for read. ch :
> java.nio.channels.SocketChannel[connected local=/9.50.102.80:58371 remote=/9.50.102.79:50010]
>         at
> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>         at
> org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>         at
> org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>         at
> org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:116)
>         at java.io.DataInputStream.readShort(DataInputStream.java:306)
>         at
> org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:359)
>         at
> org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:112)
>         at java.lang.Thread.run(Thread.java:738)
> 2013-04-17 11:13:44,818 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving block
> blk_8413378381769505032_1477 src: /9.50.102.81:35279 dest: /
> 9.50.102.80:50010
>
>
> 2. tasktracker log:
> 2013-04-23 11:48:26,783 INFO org.apache.hadoop.mapred.UserLogCleaner:
> Deleting user log path job_201304152248_0011
> 2013-04-30 14:48:15,506 ERROR org.apache.hadoop.mapred.TaskTracker: Caught
> exception: java.io.IOException: Call to node1/9.50.102.81:9001 failed on
> local exception: java.io.IOException: Connection reset by peer
>         at org.apache.hadoop.ipc.Client.wrapException(Client.java:1144)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1112)
>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
>         at org.apache.hadoop.mapred.$Proxy2.heartbeat(Unknown Source)
>         at
> org.apache.hadoop.mapred.TaskTracker.transmitHeartBeat(TaskTracker.java:2008)
>         at
> org.apache.hadoop.mapred.TaskTracker.offerService(TaskTracker.java:1802)
>         at org.apache.hadoop.mapred.TaskTracker.run(TaskTracker.java:2654)
>         at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3909)
> Caused by: java.io.IOException: Connection reset by peer
>         at sun.nio.ch.FileDispatcher.read0(Native Method)
>         at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:33)
>         at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:210)
>         at sun.nio.ch.IOUtil.read(IOUtil.java:183)
>         at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:257)
>         at
> org.apache.hadoop.net.SocketInputStream$Reader.performIO(SocketInputStream.java:55)
>         at
> org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
>         at
> org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:155)
>         at
> org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:128)
>         at java.io.FilterInputStream.read(FilterInputStream.java:127)
>         at
> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:361)
>         at java.io.BufferedInputStream.fill(BufferedInputStream.java:229)
>         at java.io.BufferedInputStream.read(BufferedInputStream.java:248)
>         at java.io.DataInputStream.readInt(DataInputStream.java:381)
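A "Connection reset by peer" on the heartbeat call usually means the JobTracker side dropped the connection, often because that JVM died or was killed by the OOM killer. A hedged sketch of quick checks, assuming the `node1` hostname and port 9001 from the log above (adjust for your cluster):

```shell
# Is the JobTracker JVM still alive on node1?
ssh node1 'jps | grep JobTracker'

# Is anything still listening on the heartbeat port?
ssh node1 'netstat -tln | grep 9001'

# Is node1 out of memory? An OOM-killed daemon leaves a trace
# in the kernel log.
ssh node1 'free -m; dmesg | grep -i "out of memory" | tail -5'
```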