Re: Problem with Hadoop and /etc/hosts file
Make sure the same content is present on every machine in your cluster.
Copy this:

127.0.0.1         localhost
10.220.55.41    skil01
10.220.55.42    skil02
10.220.55.40    skil03

and paste it into /etc/hosts on skil01, skil02, and skil03, then try
again. Also check that these IPs are correct.
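
If you want to double-check resolution from the JVM side (which is what
Hadoop and HBase ultimately see), here is a minimal sketch using only the
plain JDK; the hostnames are the ones from this thread, so adjust them
for your cluster:

import java.net.InetAddress;

// Forward/reverse DNS sanity check with the plain JDK.
// The hostnames are the ones from this thread; adjust for your cluster.
public class HostsCheck {
    public static void main(String[] args) throws Exception {
        for (String host : new String[] {"skil01", "skil02", "skil03"}) {
            InetAddress addr = InetAddress.getByName(host);  // forward lookup
            String reverse = addr.getCanonicalHostName();    // reverse lookup
            System.out.println(host + " -> " + addr.getHostAddress()
                    + " -> " + reverse);
        }
    }
}

If the last column does not come back as the hostname you started with,
reverse resolution is the problem rather than the forward entries in
/etc/hosts.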

Regards


Shashwat Shriparv

On Mon, Sep 17, 2012 at 1:09 PM, Alberto Cordioli <[EMAIL PROTECTED]> wrote:

> How can I set my rDNS?
> Anyway, this is the /etc/hosts file on my hosts:
>
> 127.0.0.1               localhost
> 10.220.55.41    skil01
> 10.220.55.42    skil02
> 10.220.55.40    skil03
>
> The file /etc/hostname contains only one line with the name of the
> current host. For example, on skil01 it contains:
> skil01
>
>
> Alberto
>
>
>
> On 17 September 2012 07:47, shashwat shriparv <[EMAIL PROTECTED]> wrote:
> > Can you send the contents of your hostname and hosts files?
> >
> > Regards
> >
> > ∞
> > Shashwat Shriparv
> >
> >
> >
> > On Mon, Sep 17, 2012 at 11:09 AM, Michel Segel <[EMAIL PROTECTED]> wrote:
> >
> >> Just a hunch: with DNS, do you have your rDNS (reverse DNS lookup) set up
> >> correctly?
> >>
> >> Sent from a remote device. Please excuse any typos...
> >>
> >> Mike Segel
> >>
> >> On Sep 15, 2012, at 8:04 PM, Alberto Cordioli <[EMAIL PROTECTED]> wrote:
> >>
> >> > This is the configuration I have used until now. It works, but gives
> >> > the mentioned error (although the procedure seems to return correct
> >> > results anyway).
> >> > I think /etc/hosts should also contain the line
> >> > 127.0.0.1 hostname
> >> >
> >> > but in that case Hadoop does not start.
> >> >
> >> > Alberto
> >> >
> >> > On 14 September 2012 18:19, Shumin Wu <[EMAIL PROTECTED]> wrote:
> >> >> Would that work for you?
> >> >>
> >> >> 127.0.0.1        localhost
> >> >> 10.220.55.41  hostname
> >> >>
> >> >> -Shumin
> >> >>
> >> >> On Fri, Sep 14, 2012 at 6:18 AM, Alberto Cordioli <[EMAIL PROTECTED]> wrote:
> >> >>
> >> >>> Hi,
> >> >>>
> >> >>> I've successfully installed Apache HBase on a cluster with Hadoop.
> >> >>> It works fine, but when I try to use Pig to load some data from an
> >> >>> HBase table I get this error:
> >> >>>
> >> >>> ERROR org.apache.hadoop.hbase.mapreduce.TableInputFormatBase - Cannot
> >> >>> resolve the host name for /10.220.55.41 because of
> >> >>> javax.naming.OperationNotSupportedException: DNS service refused
> >> >>> [response code 5]; remaining name '41.55.220.10.in-addr.arpa'
> >> >>>
> >> >>> Pig returns the correct results in any case (actually, I don't know
> >> >>> how), but I'd like to solve this issue.
> >> >>>
> >> >>> I discovered that this error is due to a mistake in the /etc/hosts
> >> >>> configuration file. In fact, as reported in the documentation
> >> >>> (http://hbase.apache.org/book.html#os), I should add the line
> >> >>> 127.0.0.1    hostname
> >> >>>
> >> >>> But if I add this entry my Hadoop cluster does not start, since the
> >> >>> datanode binds to the loopback address instead of the hostname/IP
> >> >>> address. For this reason many tutorials suggest removing that entry
> >> >>> (e.g.
> >> >>> http://stackoverflow.com/questions/8872807/hadoop-datanodes-cannot-find-namenode).
> >> >>>
> >> >>> Basically, if I add that line Hadoop won't work, but if I keep the
> >> >>> file without that loopback entry I get the above error.
> >> >>> What can I do? What is the right configuration?
> >> >>>
> >> >>>
> >> >>> Thanks,
> >> >>> Alberto
> >> >>>
> >> >>>
> >> >>>
> >> >>>
> >> >>> --
> >> >>> Alberto Cordioli
> >> >>>
> >> >
> >> >
> >> >
> >> > --
> >> > Alberto Cordioli
> >> >
> >>
> >
> >
> >
> > --
> >
> >
> > ∞
> > Shashwat Shriparv
>
>
>
> --
> Alberto Cordioli
>
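
By the way, if you want to reproduce that "DNS service refused" reverse
lookup outside of HBase, a minimal sketch with the standard JNDI DNS
provider (the same javax.naming mechanism named in the exception above)
looks like this; the in-addr.arpa name is the one from your error message:

import java.util.Hashtable;
import javax.naming.directory.Attributes;
import javax.naming.directory.DirContext;
import javax.naming.directory.InitialDirContext;

// Reverse (PTR) lookup via JNDI, the mechanism behind the
// javax.naming.OperationNotSupportedException in the error above.
// "41.55.220.10.in-addr.arpa" is 10.220.55.41 with its octets reversed.
public class ReverseDnsCheck {
    public static void main(String[] args) throws Exception {
        Hashtable<String, String> env = new Hashtable<>();
        env.put("java.naming.factory.initial",
                "com.sun.jndi.dns.DnsContextFactory");
        DirContext ctx = new InitialDirContext(env);
        Attributes attrs = ctx.getAttributes(
                "41.55.220.10.in-addr.arpa", new String[] {"PTR"});
        System.out.println("PTR: " + attrs.get("PTR"));
        ctx.close();
    }
}

If this throws the same exception, the refusal is coming from the DNS
server (a missing or refused PTR zone), not from HBase itself.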

--

Shashwat Shriparv