TL;DR: I propose to create official hadoop images and upload them to the
dockerhub.

GOAL/SCOPE: I would like to improve the existing documentation with
easy-to-use docker based recipes to start hadoop clusters with various
configurations.

The images could also be used to test experimental features. For example,
ozone could be tested easily with a compose file and configuration like this:
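As a minimal sketch (the image name, command layout and config file location
are hypothetical placeholders, not an agreed-upon convention), such a
docker-compose.yaml could look like:

```yaml
# docker-compose.yaml -- minimal sketch; apache/hadoop-runner is a placeholder name
version: "3"
services:
  namenode:
    image: apache/hadoop-runner      # hypothetical image
    command: ["hdfs", "namenode"]
    ports:
      - 9870:9870                    # namenode web UI
    env_file:
      - ./docker-config              # shared cluster configuration
  datanode:
    image: apache/hadoop-runner
    command: ["hdfs", "datanode"]
    env_file:
      - ./docker-config
```

After a `docker-compose up -d` the cluster could be scaled out with
`docker-compose scale datanode=3`.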

Or the configuration could even be included in the compose file itself:
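For example, assuming the image's entrypoint translates environment variables
of the form `CORE-SITE.XML_<key>` into the corresponding configuration files
(an assumed convention for illustration, not a finalized one), the
configuration could live directly in the compose file:

```yaml
# sketch: configuration embedded as environment variables
version: "3"
services:
  datanode:
    image: apache/hadoop-runner      # hypothetical image
    command: ["hdfs", "datanode"]
    environment:
      CORE-SITE.XML_fs.defaultFS: hdfs://namenode:9000
      HDFS-SITE.XML_dfs.replication: "1"
```

This keeps a whole example cluster definition in one self-contained file,
which is convenient for copy-paste recipes in the documentation.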

I would like to create separate example compose files for federation,
HA, metrics usage, etc. to make it easier to try out and understand the
different setups.

CONTEXT: There is an existing Jira,
but it's about a tool to generate production-quality docker images
(multiple types, in a flexible way). If there are no objections, I will
create a separate issue to create simplified docker images for rapid
prototyping and investigating new features, and register the branch on
the dockerhub to build the images automatically.

MY BACKGROUND: I have been working with docker based hadoop/spark clusters
for quite a while and have run them successfully in different environments
(kubernetes, docker-swarm, nomad-based scheduling, etc.). My work is
available from here: but those images also handle
more complex use cases (eg. instrumenting java processes with btrace, or
reading/reloading configuration from consul).
  And IMHO it's better for the official hadoop documentation to suggest
using official apache docker images rather than external ones (which could
change at any time).

Please let me know if you have any comments.

