

Re: Topology : Script Based Mapping
Hi,

On Tue, Dec 28, 2010 at 6:03 PM, Rajgopal Vaithiyanathan
<[EMAIL PROTECTED]> wrote:
> I wrote a script to map the IPs to a rack. The script is as follows:
>
> for i in $* ; do
>        topo=`echo $i | cut -d"." -f1,2,3 | sed 's/\./-/g'`
>        topo=/rack-$topo" "
>        final=$final$topo
> done
> echo $final
>
> I also did `chmod +x topology_script.sh`
>
> I tried some sample data:
>
> [joa@localhost bin]$ ./topology_script.sh 172.21.1.2 172.21.3.4
> /rack-172-21-1 /rack-172-21-3
>
> I also made the change in core-site.xml as follows.
>
> <property>
>  <name>topology.script.file.name</name>
>  <value>$HOME/sw/hadoop-0.20.2/bin/topology_script.sh</value>
> </property>
>

I am not sure whether $HOME gets expanded automatically. Can you try it
as ${HOME}, or, failing that, specify the fully expanded path? The stack
trace below shows the literal string "$HOME/..." being passed to the
process launcher, which suggests no expansion is happening.

Thanks
Hemanth
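
As a sketch of that suggestion: the error message below shows the working
directory is /home/joa/sw/hadoop-0.20.2, so assuming $HOME is /home/joa,
the property with the path expanded by hand would look like this:

```
<property>
  <name>topology.script.file.name</name>
  <value>/home/joa/sw/hadoop-0.20.2/bin/topology_script.sh</value>
</property>
```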
> But while starting the cluster, the namenode log shows the error listed
> below, and every IP gets mapped to /default-rack.
>
> Kindly help. :)
> Thanks in advance.
>
> 2010-12-28 17:30:50,549 WARN org.apache.hadoop.net.ScriptBasedMapping:
> java.io.IOException: Cannot run program
> "$HOME/sw/hadoop-0.20.2/bin/topology_script.sh" (in directory
> "/home/joa/sw/hadoop-0.20.2"): java.io.IOException: error=2, No such file or
> directory
>        at java.lang.ProcessBuilder.start(ProcessBuilder.java:474)
>        at org.apache.hadoop.util.Shell.runCommand(Shell.java:149)
>        at org.apache.hadoop.util.Shell.run(Shell.java:134)
>        at
> org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:286)
>        at
> org.apache.hadoop.net.ScriptBasedMapping$RawScriptBasedMapping.runResolveCommand(ScriptBasedMapping.java:148)
>        at
> org.apache.hadoop.net.ScriptBasedMapping$RawScriptBasedMapping.resolve(ScriptBasedMapping.java:94)
>        at
> org.apache.hadoop.net.CachedDNSToSwitchMapping.resolve(CachedDNSToSwitchMapping.java:59)
>        at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.resolveNetworkLocation(FSNamesystem.java:2158)
>        at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:2129)
>        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.register(NameNode.java:687)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>        at java.lang.reflect.Method.invoke(Method.java:616)
>        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
>        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
>        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at javax.security.auth.Subject.doAs(Subject.java:416)
>        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
> Caused by: java.io.IOException: java.io.IOException: error=2, No such file
> or
> directory
>        at java.lang.UNIXProcess.<init>(UNIXProcess.java:164)
>        at java.lang.ProcessImpl.start(ProcessImpl.java:81)
>        at java.lang.ProcessBuilder.start(ProcessBuilder.java:467)
>        ... 19 more
>
> --
> Thanks and Regards,
> Rajgopal Vaithiyanathan.
>
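
For reference, the script from the thread can be tightened up as below.
This is only a sketch: the quoting is added, the loop is wrapped in a
function (the name `map_ips_to_racks` is mine, not from the thread), and
the output is built without a trailing space. Hadoop invokes the script
with one or more IPs/hostnames as arguments and expects one rack path per
argument on stdout; if the script fails to run, nodes fall back to
/default-rack, which is exactly the symptom reported above.

```shell
#!/bin/sh
# map_ips_to_racks: print a space-separated rack path for each IP argument.
# Rack name is derived from the first three octets, with dots replaced by
# dashes, e.g. 172.21.1.2 -> /rack-172-21-1 (same scheme as the thread).
map_ips_to_racks() {
    out=""
    for i in "$@"; do
        # First three octets, dots -> dashes
        topo=$(echo "$i" | cut -d'.' -f1,2,3 | sed 's/\./-/g')
        # Append with a separating space only after the first entry
        out="$out${out:+ }/rack-$topo"
    done
    echo "$out"
}

# Hadoop passes the addresses to resolve as command-line arguments
map_ips_to_racks "$@"
```

Invoked the same way as in the thread, `./topology_script.sh 172.21.1.2
172.21.3.4` prints `/rack-172-21-1 /rack-172-21-3`.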