To deploy software I suggest Pulp.
For a package-based distro (Debian, Red Hat, CentOS) you can build Apache
Hadoop, package it, and deploy it. Configs, as Cos says, go over Puppet. If you
use Red Hat / CentOS, take a look at Spacewalk.
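As an illustration, a minimal Puppet manifest for that approach might look like the sketch below. The package name, config path, template name, and service name are all placeholders, not taken from any particular recipe (BigTop's or otherwise) - adjust them to the packages you actually build:

```puppet
# Sketch: install a prebuilt hadoop package and manage its core config.
# 'hadoop', the template path, and 'hadoop-datanode' are assumed names.
class hadoop::node {
  package { 'hadoop':
    ensure => installed,
  }

  file { '/etc/hadoop/core-site.xml':
    ensure  => file,
    content => template('hadoop/core-site.xml.erb'),
    require => Package['hadoop'],
  }

  service { 'hadoop-datanode':
    ensure    => running,
    enable    => true,
    subscribe => File['/etc/hadoop/core-site.xml'],
  }
}
```

Once a class like this is in place, scaling from 10 to 100 nodes is just a matter of pointing the new machines' Puppet agents at the master.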
On Mon, Dec 5, 2011 at 8:20 PM, Konstantin Boudnik <[EMAIL PROTECTED]> wrote:
> There's that great project called BigTop (in the Apache Incubator) which
> provides for building the Hadoop stack.
> Part of what it provides is a set of Puppet recipes which will allow you
> to do exactly what you're looking for, with perhaps some minor corrections.
> Seriously, look at Puppet - otherwise you will be living through a
> nightmare of configuration mismanagement.
> On Mon, Dec 05, 2011 at 04:02PM, praveenesh kumar wrote:
> > Hi all,
> > Can anyone guide me on how to automate the hadoop
> > installation/configuration process?
> > I want to install hadoop on 10-20 nodes, which may even grow to 50-100
> > nodes.
> > I know we can use configuration tools like Puppet or shell scripts.
> > Has anyone done it?
> > How can we do hadoop installations on so many machines in parallel? What
> > are the best practices for this?
> > Thanks,
> > Praveenesh