Re: Configuring SSH - is it required for pseudo-distributed mode?

Hello Raj,

     SSH is actually two things:
1- ssh : the command we use to connect to remote machines - the client.
2- sshd : the daemon that runs on the server and allows clients to
connect to it.
The ssh client comes preinstalled on most Linux distributions, but to run
the sshd daemon we first need to install the SSH server package.
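
For example, on Ubuntu (a sketch; package and service names vary by
distribution):

# Install the OpenSSH server package, which provides sshd
sudo apt-get install openssh-server
# Start the daemon and confirm it is running
sudo service ssh start
sudo service ssh status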

To start the Hadoop daemons you have to set up passwordless SSH and then
issue bin/start-dfs.sh and bin/start-mapred.sh.
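
A minimal sketch of the full sequence (assuming you run as the Hadoop user
from inside the Hadoop install directory):

# Generate a key pair with an empty passphrase
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
# Authorize the key for logins to this machine
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
# Verify: this should log in without prompting for a password
ssh localhost
exit
# Now the start scripts can reach localhost unattended
bin/start-dfs.sh
bin/start-mapred.sh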

You might find this link useful:
http://cloudfront.blogspot.in/2012/07/how-to-setup-and-configure-ssh-on-ubuntu.html#.UZUCkUAW38s

Warm Regards,
Tariq
cloudfront.blogspot.com
On Thu, May 16, 2013 at 9:26 PM, Raj Hadoop <[EMAIL PROTECTED]> wrote:

>  Hi,
>
> I am a bit confused here. I am planning to run on a single machine.
>
> So what should I do to start the Hadoop processes? How should I do SSH? Can
> you please briefly explain to me what SSH is?
>
> Thanks,
> Raj
>   *From:* Jay Vyas <[EMAIL PROTECTED]>
> *To:* "[EMAIL PROTECTED]" <[EMAIL PROTECTED]>
> *Cc:* Raj Hadoop <[EMAIL PROTECTED]>
> *Sent:* Thursday, May 16, 2013 11:34 AM
> *Subject:* Re: Configuring SSH - is it required for pseudo-distributed
> mode?
>
>  Actually, I should amend my statement -- SSH is required, but
> passwordless SSH you can (I guess) live without if you are willing to
> enter your password for each process that gets started.
>
> But why wouldn't you want to set up passwordless SSH in a
> pseudo-distributed cluster? It's very easy to implement on a single node:
>
> cat ~/.ssh/id_rsa.pub >> /root/.ssh/authorized_keys
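>
> (That assumes the key pair already exists -- generate one first with
> "ssh-keygen -t rsa -P ''" if needed -- and that the Hadoop daemons run as
> root. Afterwards, "ssh localhost" should log you in without a password
> prompt.)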
>
> On Thu, May 16, 2013 at 11:31 AM, Jay Vyas <[EMAIL PROTECTED]> wrote:
>
> Yes, it is required -- in pseudo-distributed mode the start scripts are
> not necessarily aware that the task trackers / data nodes are on the same
> machine, and will thus attempt to SSH into them when starting the
> respective daemons (i.e. via start-all.sh).
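>
> (Roughly what the start scripts do for each host listed in conf/slaves --
> a sketch of the idea, not the literal script:)
>
> for host in $(cat conf/slaves); do
>   ssh "$host" "$HADOOP_HOME/bin/hadoop-daemon.sh start datanode"
> done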
>
>
> On Thu, May 16, 2013 at 11:21 AM, kishore alajangi <
> [EMAIL PROTECTED]> wrote:
>
>  When you start the Hadoop processes, each process asks for a password
> before starting. To avoid this, we configure passwordless SSH, whether
> you use a single node or multiple nodes. It is not mandatory, even across
> multiple systems, if you are willing to enter the password for each
> process.
>
> Thanks,
> Kishore.
>
>
> On Thu, May 16, 2013 at 8:24 PM, Raj Hadoop <[EMAIL PROTECTED]> wrote:
>
>   Hi,
>
> I have a dedicated user on a Linux server for Hadoop. I am installing it
> in pseudo-distributed mode on this box. I want to test my programs on this
> machine. But I see that the installation steps say SSH needs to be
> configured. If it is a single node, I don't require it, right? Please
> advise.
>
> I was looking at this site
>
> http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
>
> It mentioned the following:
> "
> Hadoop requires SSH access to manage its nodes, i.e. remote machines plus
> your local machine if you want to use Hadoop on it (which is what we want
> to do in this short tutorial). For our single-node setup of Hadoop, we
> therefore need to configure SSH access to localhost for the hduser user
> we created in the previous section.
> "
>
> Thanks,
> Raj
>
> --
> Jay Vyas
> http://jayunit100.blogspot.com/
>
> --
> Jay Vyas
> http://jayunit100.blogspot.com/