Re: Rack Aware Hadoop cluster
Here is a sample I stole from the web and modified slightly... I think.

#!/bin/bash
# Rack topology script for Hadoop. It must run under bash: the array syntax
# below is not valid in plain sh/dash and produces the
# 'Syntax error: "(" unexpected' error shown earlier in this thread.

HADOOP_CONF=/etc/hadoop/conf

# For each host name or IP passed in, look up its rack in rack_info.txt
# and print it; unknown hosts fall back to /default/rack.
while [ $# -gt 0 ] ; do
  nodeArg=$1
  exec < ${HADOOP_CONF}/rack_info.txt
  result=""
  while read line ; do
    ar=( $line )
    if [ "${ar[0]}" = "$nodeArg" ] ; then
      result="${ar[1]}"
    fi
  done
  shift
  if [ -z "$result" ] ; then
    echo -n "/default/rack "
  else
    echo -n "$result "
  fi
done
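
It is worth running the script by hand, outside of Hadoop, against both host names and IP addresses (as Chris suggests further down in the thread). A quick sanity check might look like this, assuming the script is saved as /etc/hadoop/conf/rack_topology.sh (the name is just for illustration) and the rack_info.txt described next is in place:

$ bash /etc/hadoop/conf/rack_topology.sh 10.10.10.10 datanode2
/dc1/rack1 /dc1/rack2
$ bash /etc/hadoop/conf/rack_topology.sh some-unknown-host
/default/rack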
The rack_info.txt file contains both the hostname and the IP address for each node:
10.10.10.10  /dc1/rack1
10.10.10.11  /dc1/rack2
datanode1  /dc1/rack1
datanode2  /dc1/rack2
... etc.
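
To actually use it, point the NameNode at the script in core-site.xml and make sure the script is executable. A minimal sketch (the path is just an example; in Hadoop 2.x the property is net.topology.script.file.name, while older releases use topology.script.file.name):

<property>
  <name>net.topology.script.file.name</name>
  <value>/etc/hadoop/conf/rack_topology.sh</value>
</property>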
On Wed, May 8, 2013 at 1:38 PM, Adam Faris <[EMAIL PROTECTED]> wrote:

> Look between the <code> blocks starting at line 1336.
> http://lnkd.in/rJsqpV   Some day it will get included in the
> documentation with a future Hadoop release. :)
>
> -- Adam
>
> On May 8, 2013, at 10:29 AM, Mohammad Mustaqeem <[EMAIL PROTECTED]> wrote:
>
> > If anybody has a sample (topology.script.file.name) script then please share it.
> >
> >
> > On Wed, May 8, 2013 at 10:30 PM, Mohammad Mustaqeem <[EMAIL PROTECTED]> wrote:
> > @chris, I have tested it outside. It is working fine.
> >
> >
> > On Wed, May 8, 2013 at 7:48 PM, Leonid Fedotov <[EMAIL PROTECTED]> wrote:
> > Error in script.
> >
> >
> > On Wed, May 8, 2013 at 7:11 AM, Chris Embree <[EMAIL PROTECTED]> wrote:
> > Your script has an error in it.  Please test your script using both IP addresses and names, outside of Hadoop.
> >
> >
> > On Wed, May 8, 2013 at 10:01 AM, Mohammad Mustaqeem <[EMAIL PROTECTED]> wrote:
> > I have done this and found the following error in the log -
> >
> >
> > 2013-05-08 18:53:45,221 WARN org.apache.hadoop.net.ScriptBasedMapping: Exception running /home/mustaqeem/development/hadoop-2.0.3-alpha/etc/hadoop/rack.sh 127.0.0.1
> > org.apache.hadoop.util.Shell$ExitCodeException: /home/mustaqeem/development/hadoop-2.0.3-alpha/etc/hadoop/rack.sh: 8: /home/mustaqeem/development/hadoop-2.0.3-alpha/etc/hadoop/rack.sh: Syntax error: "(" unexpected (expecting "done")
> >
> >       at org.apache.hadoop.util.Shell.runCommand(Shell.java:202)
> >       at org.apache.hadoop.util.Shell.run(Shell.java:129)
> >       at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:322)
> >       at org.apache.hadoop.net.ScriptBasedMapping$RawScriptBasedMapping.runResolveCommand(ScriptBasedMapping.java:241)
> >       at org.apache.hadoop.net.ScriptBasedMapping$RawScriptBasedMapping.resolve(ScriptBasedMapping.java:179)
> >       at org.apache.hadoop.net.CachedDNSToSwitchMapping.resolve(CachedDNSToSwitchMapping.java:119)
> >       at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.resolveNetworkLocation(DatanodeManager.java:454)
> >       at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:713)
> >       at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3459)
> >       at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:881)
> >       at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
> >       at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:18295)
> >       at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:454)
> >       at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1014)
> >       at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1735)
> >       at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1731)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:415)
> >       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1441)
> >       at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1729)