Hadoop >> mail # dev >> development environment for hadoop core

Re: development environment for hadoop core
Hi Erik,

When I started out on Hadoop development, I used emacs for most of my
work. I eventually "saw the light" and switched to Eclipse with a bunch
of emacs keybindings - an IDE is really handy in Java for features like
"find callers of", quick navigation to types, etc. etags gets you part
of the way, but I'm pretty sold on Eclipse at this point. The other big
advantage I found in Eclipse is that the turnaround time on running
tests is near-instant - make a change, hit save, and run a unit test in
a second or two, instead of waiting 20+ seconds for Maven (even on a
non-clean build).
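For comparison, the Maven path Todd mentions looks roughly like the
sketch below. This is illustrative, not from the thread: the module path
and test class name are examples and will vary by branch, but
`-Dtest=` single-test selection is standard Maven Surefire usage.

```shell
# Run one unit test class from the command line (hypothetical example names).
# Even scoped to a single module and test, Maven's startup and dependency
# resolution add the multi-second overhead the email describes.
cd hadoop-hdfs-project/hadoop-hdfs
mvn test -Dtest=TestDFSUtil
```

Eclipse avoids that startup cost because its incremental compiler has
already built the changed classes by the time you hit "Run".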

That said, for quick fixes or remote debugging work I fall back to vim
pretty quickly.


On Tue, Jan 15, 2013 at 3:50 PM, Erik Paulson <[EMAIL PROTECTED]> wrote:

> Hello -
> I'm curious what Hadoop developers use for their day-to-day hacking on
> Hadoop. I'm talking changes to the Hadoop libraries and daemons, and not
> developing Map-Reduce jobs or using the HDFS Client libraries to talk
> to a filesystem from an application.
> I've checked out Hadoop, made minor changes and built it with Maven, and
> tracked down the resulting artifacts in a target/ directory that I could
> deploy. Is this typically how a cloudera/hortonworks/mapr/etc dev works, or
> are the IDEs more common?
> I realize this sort of sounds like a dumb question, but I'm mostly curious
> what I might be missing out on if I stay with vim and nothing else. I'm also
> not entirely sure where Maven caches the jars it uses to build, or how
> careful I have to be to ensure that my changes wind up in the right places
> without having to do a clean build every time.
> Thanks!
> -Erik
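On the caching question raised above: Maven's local repository lives at
`~/.m2/repository` by default, which is where installed Hadoop artifacts
and downloaded dependencies both end up. A rough sketch of the
edit-build-inspect loop (the `hadoop-dist` path is an assumption about
the trunk layout at the time and may differ in other branches):

```shell
# Build and install all module jars into the local repository (~/.m2/repository),
# skipping tests for a faster turnaround.
mvn install -DskipTests

# Maven's local cache of built and downloaded Hadoop jars.
ls ~/.m2/repository/org/apache/hadoop/

# Deployable artifacts land under each module's target/ directory;
# the assembled distribution (if built) is under hadoop-dist/target/.
ls hadoop-dist/target/
```

Because `mvn install` overwrites the snapshot jars in the local
repository, a non-clean incremental build is usually enough for changes
to propagate; a clean build is only needed when stale generated sources
or cross-module incompatibilities creep in.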

Todd Lipcon
Software Engineer, Cloudera