
MapReduce, mail # user - Rack Aware Hadoop cluster


Re: Rack Aware Hadoop cluster
Mohammad Mustaqeem 2013-05-09, 19:00
That problem has been resolved.
But a new issue has arisen. When I start the dfs, I see this line in the
namenode log - "2013-05-09 15:29:44,270 INFO logs: Aliases are enabled". What
does it mean?
After that, nothing happens. jps shows that the datanode is running, but the web
interface for dfshealth is not.
Any idea?
Please help.
On Thu, May 9, 2013 at 10:00 PM, Mohammad Mustaqeem <[EMAIL PROTECTED]> wrote:

>> On Thu, May 9, 2013 at 2:22 AM, Serge Blazhievsky <[EMAIL PROTECTED]> wrote:
>>
>>> That's the one I use too; I think it's on the Apache web site.
>>>
>>> Sent from my iPhone
>>>
>>> On May 8, 2013, at 1:49 PM, Chris Embree <[EMAIL PROTECTED]> wrote:
>>>
>>> Here is a sample I stole from the web and modified slightly... I think.
>>>
>>> #!/bin/bash
>>> # bash is required: the ar=( $line ) array assignment below is not POSIX sh.
>>> HADOOP_CONF=/etc/hadoop/conf
>>>
>>> # For each host/IP argument, print its rack from rack_info.txt,
>>> # falling back to /default/rack for unknown nodes.
>>> while [ $# -gt 0 ] ; do
>>>   nodeArg=$1
>>>   exec < ${HADOOP_CONF}/rack_info.txt
>>>   result=""
>>>   while read line ; do
>>>     ar=( $line )
>>>     if [ "${ar[0]}" = "$nodeArg" ] ; then
>>>       result="${ar[1]}"
>>>     fi
>>>   done
>>>   shift
>>>   if [ -z "$result" ] ; then
>>>     echo -n "/default/rack "
>>>   else
>>>     echo -n "$result "
>>>   fi
>>>
>>> done
>>>
>>>
>>> The rack_info.txt file contains both the hostname and the IP address for
>>> each node:
>>> 10.10.10.10  /dc1/rack1
>>> 10.10.10.11  /dc1/rack2
>>> datanode1  /dc1/rack1
>>> datanode2  /dc1/rack2
>>> ... etc.
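For completeness, a script like this only takes effect once it is registered in core-site.xml. A minimal sketch, assuming the script was saved as /etc/hadoop/conf/rack.sh (a hypothetical path) and made executable:

```xml
<!-- Hypothetical path; point this at wherever the script actually lives. -->
<property>
  <name>topology.script.file.name</name>
  <value>/etc/hadoop/conf/rack.sh</value>
</property>
```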
>>>
>>>
>>> On Wed, May 8, 2013 at 1:38 PM, Adam Faris <[EMAIL PROTECTED]> wrote:
>>>
>>>> Look between the <code> blocks starting at line 1336.
>>>> http://lnkd.in/rJsqpV   Some day it will get included in the
>>>> documentation with a future Hadoop release. :)
>>>>
>>>> -- Adam
>>>>
>>>> On May 8, 2013, at 10:29 AM, Mohammad Mustaqeem <[EMAIL PROTECTED]> wrote:
>>>>
>>>> > If anybody has a sample (topology.script.file.name) script, please share it.
>>>> >
>>>> >
>>>> > On Wed, May 8, 2013 at 10:30 PM, Mohammad Mustaqeem <[EMAIL PROTECTED]> wrote:
>>>> > @chris, I have tested it outside. It is working fine.
>>>> >
>>>> >
>>>> > On Wed, May 8, 2013 at 7:48 PM, Leonid Fedotov <[EMAIL PROTECTED]> wrote:
>>>> > Error in script.
>>>> >
>>>> >
>>>> > On Wed, May 8, 2013 at 7:11 AM, Chris Embree <[EMAIL PROTECTED]> wrote:
>>>> > Your script has an error in it. Please test your script using both IP addresses and hostnames, outside of Hadoop.
>>>> >
>>>> >
>>>> > On Wed, May 8, 2013 at 10:01 AM, Mohammad Mustaqeem <[EMAIL PROTECTED]> wrote:
>>>> > I have done this and found the following error in the log -
>>>> >
>>>> >
>>>> > 2013-05-08 18:53:45,221 WARN org.apache.hadoop.net.ScriptBasedMapping: Exception running /home/mustaqeem/development/hadoop-2.0.3-alpha/etc/hadoop/rack.sh 127.0.0.1
>>>> > org.apache.hadoop.util.Shell$ExitCodeException: /home/mustaqeem/development/hadoop-2.0.3-alpha/etc/hadoop/rack.sh: 8: /home/mustaqeem/development/hadoop-2.0.3-alpha/etc/hadoop/rack.sh: Syntax error: "(" unexpected (expecting "done")
>>>> >
>>>> >       at org.apache.hadoop.util.Shell.runCommand(Shell.java:202)
>>>> >       at org.apache.hadoop.util.Shell.run(Shell.java:129)
>>>> >       at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:322)
*With regards ---*
*Mohammad Mustaqeem*,
M.Tech (CSE)
MNNIT Allahabad
9026604270