

Re: Best practices - Large Hadoop Cluster
On Tue, Aug 10, 2010 at 10:01 AM, Brian Bockelman <[EMAIL PROTECTED]> wrote:
> Hi Raj,
> I believe the best practice is to *not* start up Hadoop over SSH.  Set it up as a system service and let your configuration management software take care of it.
> You probably want to look at ROCKS or one of its variants, or at least something like Puppet or CFEngine.
> Brian
> On Aug 10, 2010, at 8:46 AM, Raj V wrote:
>> I need to start setting up a large Hadoop cluster of 512 nodes. My biggest
>> problem is the SSH keys. Is there a simpler way of generating and exchanging SSH
>> keys among the nodes? Any best practices? If there is none, I could volunteer to
>> do it.
>> Raj
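
For the key-exchange problem itself, one common approach is to generate a single keypair on the admin node and push the public key to every worker, rather than generating and cross-exchanging 512 keypairs. The sketch below assumes a hypothetical `slaves.txt` file listing one hostname per line, a `hadoop` service user on every node, and that password authentication is still enabled for the initial copy; adjust all of these to your site.

```shell
#!/bin/sh
# Sketch: one keypair on the admin node, appended to each worker's
# authorized_keys via ssh-copy-id. Filenames, user, and key type are
# assumptions, not a prescription.
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa  # empty passphrase; site policy may differ

# slaves.txt (hypothetical): one worker hostname per line
while read -r host; do
    ssh-copy-id -i ~/.ssh/id_rsa.pub "hadoop@${host}"
done < slaves.txt
```

Note this still means typing (or scripting) the password once per node for the first copy; at 512 nodes, that is exactly the pain that pushes people toward the configuration-management route Brian describes, where the key is just another file the tool distributes.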

Shameless blog plug: an alternative to SSH keys.
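
Brian's suggestion of avoiding SSH fan-out entirely can be sketched as a per-node init script: each node starts its own daemon locally as a system service, and the master never needs to SSH anywhere. This is only an illustrative sketch; the install path and service user are assumptions, though `hadoop-daemon.sh` is the stock per-daemon control script shipped with Hadoop.

```shell
#!/bin/sh
# /etc/init.d/hadoop-datanode (sketch): run the DataNode as a local
# system service on each worker. HADOOP_HOME and DAEMON_USER are
# assumed values; adjust to your installation.
HADOOP_HOME=/usr/lib/hadoop
DAEMON_USER=hadoop

case "$1" in
  start)
    su -s /bin/sh "$DAEMON_USER" -c "$HADOOP_HOME/bin/hadoop-daemon.sh start datanode"
    ;;
  stop)
    su -s /bin/sh "$DAEMON_USER" -c "$HADOOP_HOME/bin/hadoop-daemon.sh stop datanode"
    ;;
  *)
    echo "Usage: $0 {start|stop}"
    exit 1
    ;;
esac
```

With a script like this rolled out by ROCKS, Puppet, or CFEngine, the `start-dfs.sh`/SSH-loop machinery on the master becomes unnecessary.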