Re: Hadoop 1.0.3 setup

On 07/09/2012 09:58 AM, prabhu K wrote:
> Yes, I have configured a multinode setup, 1 master and 2 slaves,
>
> I have formatted the namenode and then run the start-dfs.sh script and
> start-mapred.sh script.
>
> I run the bin/hadoop fs -put input input command and get the following error
> on my terminal.
>
> hduser@md-trngpoc1:/usr/local/hadoop_dir/hadoop$ bin/hadoop fs -put input
> input
> Warning: $HADOOP_HOME is deprecated.
> put: org.apache.hadoop.security.AccessControlException: Permission denied:
> user=hduser, access=WRITE, inode="":root:supergroup:rwxr-xr-x
> and executed the below command, which returned the /hadoop-install/hadoop directory; I
> couldn't understand what I am doing wrong.
Well, this error tells you that you have the wrong permissions on the
hadoop directory:
the owner and group you have are root:supergroup, and the correct
value is:
  hduser:supergroup
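One way to check and fix that from the command line (a sketch only, assuming the failing write goes into the HDFS home directory /user/hduser, which the thread does not state, and that you can run the commands as the HDFS superuser):

bin/hadoop fs -ls /user                                # inspect the current owner:group of the home directories
bin/hadoop fs -mkdir /user/hduser                      # create the home directory if it does not exist yet
bin/hadoop fs -chown hduser:supergroup /user/hduser    # hand ownership to hduser

After that, hduser should be able to run bin/hadoop fs -put input input without the AccessControlException.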
>
> hduser@md-trngpoc1:/usr/local/hadoop_dir/hadoop$ echo $HADOOP_HOME
> /hadoop-install/hadoop
>
> *Namenode log:*
> =========>
> java.lang.InterruptedException: sleep interrupted
>          at java.lang.Thread.sleep(Native Method)
>          at
> org.apache.hadoop.hdfs.server.namenode.DecommissionManager$Monitor.run(DecommissionManager.java:65)
>          at java.lang.Thread.run(Thread.java:662)
> 2012-07-09 19:02:12,696 ERROR
> org.apache.hadoop.hdfs.server.namenode.NameNode: java.net.BindException:
> Problem binding to md-trngpoc1/10.5.114.110:54310 : Address already in use
It seems that something is already using that address:port.
Use these commands:
netstat -puta | grep namenode
netstat -puta | grep datanode

to check which ports the NN and DN are using.
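If one of them shows a process already bound to 10.5.114.110:54310 (for example a namenode left over from an earlier start), a possible cleanup sequence is sketched below; jps, the stop/start scripts, and these netstat flags are standard on a Hadoop 1.0.3 / Linux box, but the stale PID is of course only hypothetical:

jps                          # list the running Java daemons (NameNode, DataNode, ...)
netstat -tlnp | grep 54310   # show the PID holding the namenode port
bin/stop-dfs.sh              # stop the HDFS daemons cleanly
# kill <pid>                 # only if a stale process still holds the port afterwards
bin/start-dfs.sh             # start HDFS again once the port is free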
>          at org.apache.hadoop.ipc.Server.bind(Server.java:227)
>          at org.apache.hadoop.ipc.Server$Listener.<init>(Server.java:301)
>          at org.apache.hadoop.ipc.Server.<init>(Server.java:1483)
>          at org.apache.hadoop.ipc.RPC$Server.<init>(RPC.java:545)
>          at org.apache.hadoop.ipc.RPC.getServer(RPC.java:506)
>          at
> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:294)
>          at
> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:496)
>          at
> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1279)
>          at
> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1288)
> Caused by: java.net.BindException: Address already in use
>          at sun.nio.ch.Net.bind(Native Method)
>          at
> sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:126)
>          at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:59)
>          at org.apache.hadoop.ipc.Server.bind(Server.java:225)
>          ... 8 more
> *Datanode log*
> ========================================
> 2012-07-09 18:44:39,949 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting DataNode
> STARTUP_MSG:   host = md-trngpoc3/10.5.114.168
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 1.0.3
> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
> 1335192; compiled by 'hortonfo' on Tue May  8 20:31:25 UTC 2012
> ************************************************************/
> 2012-07-09 18:44:40,039 INFO org.apache.hadoop.metrics2.impl.MetricsConfig:
> loaded properties from hadoop-metrics2.properties
> 2012-07-09 18:44:40,047 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source
> MetricsSystem,sub=Stats registered.
> 2012-07-09 18:44:40,048 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot
> period at 10 second(s).
> 2012-07-09 18:44:40,048 INFO
> org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system
> started
> 2012-07-09 18:44:40,125 INFO
> org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi
> registered.
> 2012-07-09 18:44:40,163 WARN
Check the directories and their permissions for dfs.data.dir.
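For example (a sketch; /usr/local/hadoop_dir/hadoop/dfs/data is only a guess at what dfs.data.dir points to on this cluster, and the hadoop group is an assumption, so substitute the values from your own conf/hdfs-site.xml):

grep -A 1 dfs.data.dir conf/hdfs-site.xml                       # find where the datanode stores its blocks
ls -ld /usr/local/hadoop_dir/hadoop/dfs/data                    # the directory must exist and be writable by hduser
chown -R hduser:hadoop /usr/local/hadoop_dir/hadoop/dfs/data    # fix local ownership if needed (run as root)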
Marcos Luis Ortiz Valmaseda
*Data Engineer && Sr. System Administrator at UCI*
10th ANNIVERSARY OF THE FOUNDING OF THE UNIVERSITY OF INFORMATICS SCIENCES...
CONNECTED TO THE FUTURE, CONNECTED TO THE REVOLUTION

http://www.uci.cu
http://www.facebook.com/universidad.uci
http://www.flickr.com/photos/universidad_uci