Re: Re: Re: RegionServer unable to connect to master
shashwat shriparv 2011-12-16, 08:22
Check whether passwordless SSH is enabled for the local server. If you have
already set it up, try removing the .ssh folder in /home and recreating it;
sometimes the problem is related to this as well. Just a wild guess, but have a look.
Hadoop requires SSH access to manage its nodes, i.e. remote machines plus
your local machine if you want to use Hadoop on it (which is what we want
to do in this short tutorial). For our single-node setup of Hadoop, we
therefore need to configure SSH access to localhost for the hadoop user we
created in the previous section.
I assume that you have SSH up and running on your machine and configured it
to allow SSH public key authentication. If not, there are several guides
available online.
First, we have to generate an SSH key for the hadoop user.
noll@ubuntu:~$ su - hadoop
hadoop@ubuntu:~$ ssh-keygen -t rsa -P ""
Generating public/private rsa key pair.
Enter file in which to save the key (/home/hadoop/.ssh/id_rsa):
Created directory '/home/hadoop/.ssh'.
Your identification has been saved in /home/hadoop/.ssh/id_rsa.
Your public key has been saved in /home/hadoop/.ssh/id_rsa.pub.
The key fingerprint is:
The second line will create an RSA key pair with an empty password.
Generally, using an empty password is not recommended, but in this case it
is needed to unlock the key without your interaction (you don't want to
enter the passphrase every time Hadoop interacts with its nodes).
Second, you have to enable SSH access to your local machine with this newly
created key:
hadoop@ubuntu:~$ cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
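A frequent stumbling block at this step is file permissions: sshd silently
ignores authorized_keys when ~/.ssh or the file itself is group- or
world-writable. A minimal sketch for tightening them (paths assume the
default key location):

```shell
# Tighten permissions so sshd will accept the key; sshd silently ignores
# authorized_keys when ~/.ssh or the file is group/world-writable.
SSH_DIR="$HOME/.ssh"
mkdir -p "$SSH_DIR"
touch "$SSH_DIR/authorized_keys"
chmod 700 "$SSH_DIR"                  # only the owner may enter the directory
chmod 600 "$SSH_DIR/authorized_keys"  # only the owner may read/write the file
```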
The final step is to test the SSH setup by connecting to your local machine
with the hadoop user. This step is also needed to save your local machine's
host key fingerprint to the hadoop user's known_hosts file. If you have any
special SSH configuration for your local machine like a non-standard SSH
port, you can define host-specific SSH options in $HOME/.ssh/config (see
man ssh_config for more information).
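For example, a hypothetical $HOME/.ssh/config entry for a machine running
sshd on a non-standard port might look like this (the port number here is
an assumption for illustration):

```
Host localhost
    Port 2222
    IdentityFile ~/.ssh/id_rsa
```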
hadoop@ubuntu:~$ ssh localhost
The authenticity of host 'localhost (127.0.0.1)' can't be established.
RSA key fingerprint is 76:d7:61:86:ea:86:8f:31:89:9f:68:b0:75:88:52:72.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'localhost' (RSA) to the list of known hosts.
If the SSH connection fails, these general tips might help:
- Enable debugging with ssh -vvv localhost and investigate the error in
detail.
- Check the SSH server configuration in /etc/ssh/sshd_config, in
particular the options PubkeyAuthentication (which should be set to yes)
and AllowUsers (if this option is active, add the hadoop user
to it). If you made any changes to the SSH server configuration file, you
can force a configuration reload with sudo /etc/init.d/ssh reload.
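The relevant part of /etc/ssh/sshd_config might then look like this (a
hypothetical excerpt; the AllowUsers line is only present if you restrict
logins in the first place):

```
PubkeyAuthentication yes
AllowUsers hadoop
```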
I have not found out yet how to configure Hadoop to listen on all (IPv4)
network interfaces. Using 0.0.0.0 for the various networking-related
Hadoop configuration options will result in Hadoop binding to the IPv6
addresses on my Ubuntu box.
As a workaround (and realizing that there's no practical point in enabling
IPv6 on a box when you are not connected to any IPv6 network), I simply
disabled IPv6 on my Ubuntu machine.
To disable IPv6 on Ubuntu Linux, open /etc/modprobe.d/blacklist in the
editor of your choice and add the following lines to the end of the file:
# disable IPv6
blacklist ipv6
You have to reboot your machine in order to make the changes take effect.
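To verify the result after the reboot, one possible check (assuming a kernel
that exposes the disable_ipv6 sysctl; older kernels that blacklist the module
outright will not have this file at all):

```shell
# Prints 1 if IPv6 is disabled, 0 if enabled; on kernels where the ipv6
# module was blacklisted and never loaded, the sysctl file does not exist.
if [ -f /proc/sys/net/ipv6/conf/all/disable_ipv6 ]; then
    cat /proc/sys/net/ipv6/conf/all/disable_ipv6
else
    echo "ipv6 module not loaded"
fi
```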
2011/12/16 exp <[EMAIL PROTECTED]>
> hi Mohammad Tariq,
> thanks for reply.
> I follow your instruction, change the hosts to this:
> 127.0.0.1 localhost
> 127.0.0.1 localhost ubuntu
> 10.66.201.243 master
> 10.66.201.244 slave1
> 10.66.201.245 slave2
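One way to verify such /etc/hosts changes on each node is to check that every
cluster hostname actually resolves (the hostnames below match the quoted
file; getent is available on standard Linux systems):

```shell
# Each hostname should resolve to the address configured in /etc/hosts;
# a missing entry is a common cause of RegionServer/master connection failures.
for host in localhost master slave1 slave2; do
    getent hosts "$host" > /dev/null || echo "WARNING: $host does not resolve"
done
```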