Re: Issue when clicking on BrowseFileSystem
Oh, you've *configured* "localhost" as your hostname in the hadoop
*.xml files. Yes, that'll result in the behavior you're seeing.

I was assuming you were using a hostname that other machines can
resolve. For example, running on my laptop I use "adit420" (which is
what the laptop calls itself). When running on VMs under
kvm+virt-manager, I configure Hadoop with the hostnames of the VMs,
for example vm01.local, vm02.local, et cetera.  (The .local
pseudo-domain is managed by avahi.)
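
For instance, here is a sketch of what core-site.xml and mapred-site.xml
might look like with a resolvable hostname (using my laptop's name
"adit420" purely as an illustration; substitute whatever your machine
actually answers to, and keep your own ports):

  <property>
    <name>fs.default.name</name>
    <value>hdfs://adit420:8020</value>
  </property>

  <property>
    <name>mapred.job.tracker</name>
    <value>adit420:8021</value>
  </property>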

You mustn't delete the localhost line from /etc/hosts as that might
cause a lot of other stuff to fail, possibly including sshd, xorg,
Gnome, etc.
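
A sane /etc/hosts keeps the loopback entry and maps the box's real
hostname to its actual address, roughly like this (the 192.168.1.50
address and "mybox" name are made-up placeholders):

  127.0.0.1      localhost.localdomain localhost
  192.168.1.50   mybox.example.com mybox    # placeholder address and name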

-andy

On Mon, Oct 15, 2012 at 12:24 PM, Kartashov, Andy
<[EMAIL PROTECTED]> wrote:
> Andy,
>
> My /etc/hosts does say: 127.0.0.1              localhost.localdomain localhost
> Shall I delete this entry?
>
> The only reference to localhost is in:
>
> core-site.xml:
>   <property>
>     <name>fs.default.name</name>
>     <value>hdfs://localhost:8020</value>
>   </property>
>
> mapred-site.xml:
>   <property>
>     <name>mapred.job.tracker</name>
>     <value>localhost:8021</value>
>   </property>
>
> Andy Kartashov
> MPAC
> Architecture R&D, Co-op
> 1340 Pickering Parkway, Pickering, L1V 0C4
> Phone : (905) 837 6269
> Mobile: (416) 722 1787
> [EMAIL PROTECTED]
>
> -----Original Message-----
> From: Andy Isaacson [mailto:[EMAIL PROTECTED]]
> Sent: Friday, October 12, 2012 6:24 PM
> To: [EMAIL PROTECTED]
> Subject: Re: Issue when clicking on BrowseFileSystem
>
> On Fri, Oct 12, 2012 at 2:09 PM, Kartashov, Andy <[EMAIL PROTECTED]> wrote:
>> It displays:
>> /browseDirectory.jsp?namenodeInfoPort=50070&dir=/&nnaddr=localhost.localdomain:8020
>
> OK, there are two clues in that URL that your DN and NN both think of themselves as being on localhost: the http://<datanode> portion of the URL shows the DN address as localhost, and the nnaddr=localhost part shows the NN address as localhost.
>
> I'd check your /etc/hosts for your hostname and look at the hdfs-site.xml for any occurrences of localhost.
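>
> A quick way to spot stragglers is something like this (assuming your
> configs live under /etc/hadoop/conf; adjust the path for your install):
>
>   # /etc/hadoop/conf is a guess; point this at wherever your *.xml files live
>   grep -n localhost /etc/hosts /etc/hadoop/conf/*.xml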
>
> -andy
>
>>
>>
>> Andy Kartashov
>> MPAC
>> Architecture R&D, Co-op
>> 1340 Pickering Parkway, Pickering, L1V 0C4
>> Phone : (905) 837 6269
>> Mobile: (416) 722 1787
>> [EMAIL PROTECTED]
>>
>>
>> -----Original Message-----
>> From: Andy Isaacson [mailto:[EMAIL PROTECTED]]
>> Sent: Friday, October 12, 2012 4:31 PM
>> To: [EMAIL PROTECTED]
>> Subject: Re: Issue when clicking on BrowseFileSystem
>>
>> On Fri, Oct 12, 2012 at 11:42 AM, Kartashov, Andy <[EMAIL PROTECTED]> wrote:
>>> You are absolutely right. It was indeed "localhost..:" in the URL. When I changed it to my IP address the page duly loaded. Which .xml file is responsible for this setting?
>>
>> It depends on which URL; you left out that part of the answer. :) When you click the link on dfsHealth.jsp, does the error page have nn_browsedfscontent.jsp in the URL bar, or does it have browseDirectory.jsp in the URL bar?
>>
>> The NN address can be specified with dfs.namenode.http-address and the DN with dfs.datanode.http.address, but you shouldn't need to configure those by hand; the code is supposed to be smart enough to figure it out automatically. It works in my Hadoop 2.0.x configurations here, so I'd like to track down what is different about your configuration in case we need to fix the code to DTRT for that kind of configuration.
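>>
>> If you did want to pin them down explicitly, the hdfs-site.xml entries
>> would look roughly like this ("namenode-host" and "datanode-host" are
>> placeholders for the real hostnames; 50070 and 50075 are the usual
>> default HTTP ports, so treat this as a sketch only):
>>
>>   <!-- placeholder hostnames; use names other machines can resolve -->
>>   <property>
>>     <name>dfs.namenode.http-address</name>
>>     <value>namenode-host:50070</value>
>>   </property>
>>   <property>
>>     <name>dfs.datanode.http.address</name>
>>     <value>datanode-host:50075</value>
>>   </property>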
>>
>> The first thing I would check is to make sure your server name is not
>> listed as 127.0.1.1 in /etc/hosts on the server. Since this (arguably
>> broken) configuration is the default for some distros, it would be nice if Hadoop could handle it automatically, but in my experience things work much better when I delete that line from /etc/hosts.
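>>
>> On those distros the offending entry typically looks something like
>> this (the hostname here is just a placeholder):
>>
>>   127.0.1.1   yourhostname    # placeholder; the distro puts the machine's own name here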
>>
>> -andy
>>
>>> Cheers,
>>> Andy
>>>
>>>
>>> -----Original Message-----
>>> From: Andy Isaacson [mailto:[EMAIL PROTECTED]]
>>> Sent: Friday, October 12, 2012 2:10 PM
>>> To: [EMAIL PROTECTED]
>>> Subject: Re: Issue when clicking on BrowseFileSystem