Hadoop dev mailing list: development environment for hadoop core


Thread:
  Erik Paulson           2013-01-15, 23:50
  Todd Lipcon            2013-01-16, 01:44
  Andy Isaacson          2013-01-16, 02:08
  Surenkumar Nihalani    2013-01-16, 03:38
  Steve Loughran         2013-01-16, 08:40
  Glen Mazza             2013-01-16, 13:31
  Erik Paulson           2013-01-21, 16:36

Re: development environment for hadoop core
Hi Erik,

Eclipse can run JUnit tests very rapidly.  If you want a shorter test
cycle, that's one way to get it.

There is also the Maven shell, which reduces some of the overhead of
starting Maven.  But I haven't used it, so I can't really comment.

cheers,
Colin
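
For an even shorter cycle without an IDE, Maven's Surefire plugin can run a
single test class (or, on recent Surefire versions, a single method) from the
command line. A minimal sketch; the test names here are only illustrations:

    cd hadoop-hdfs-project/hadoop-hdfs
    mvn test -Dtest=TestMiniDFSCluster            # run one test class
    mvn test -Dtest=TestMiniDFSCluster#testName   # run one test method
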
On Mon, Jan 21, 2013 at 8:36 AM, Erik Paulson <[EMAIL PROTECTED]> wrote:

> On Wed, Jan 16, 2013 at 7:31 AM, Glen Mazza <[EMAIL PROTECTED]> wrote:
>
> > On 01/15/2013 06:50 PM, Erik Paulson wrote:
> >
> >> Hello -
> >>
> >> I'm curious what Hadoop developers use for their day-to-day hacking on
> >> Hadoop. I'm talking changes to the Hadoop libraries and daemons, and not
> >> developing Map-Reduce jobs or using the HDFS client libraries to talk
> >> to a filesystem from an application.
> >>
> >> I've checked out Hadoop, made minor changes and built it with Maven, and
> >> tracked down the resulting artifacts in a target/ directory that I could
> >> deploy. Is this typically how a cloudera/hortonworks/mapr/etc dev works,
> >> or
> >> are the IDEs more common?
> >>
> > I haven't built Hadoop yet myself.  Your use of "a" in "a target/
> > directory" indicates you're also somewhat new to Maven itself, as that's
> > the standard output folder for any Maven project.  One of many nice
> > things about Maven is that once you learn how to build one project with
> > it, you pretty much know how to build any project, as everything's
> > standardized.
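
A quick sketch of that standardization; these are stock Maven defaults,
nothing Hadoop-specific:

    src/main/java       # production sources, compiled into target/classes
    src/test/java       # test sources, compiled into target/test-classes
    mvn clean package   # wipes target/ and rebuilds the artifacts under it
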
> >
> > Probably best to stick with the command line for building and use Eclipse
> > for editing, to keep things simple, but don't forget the mvn
> > eclipse:eclipse command to set up Eclipse projects that you can
> > subsequently import into your Eclipse IDE:
> > http://www.jroller.com/gmazza/entry/web_service_tutorial#EclipseSetup
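
A minimal version of that workflow, assuming a fresh checkout:

    mvn install -DskipTests   # build once so inter-module dependencies resolve
    mvn eclipse:eclipse       # generates .project and .classpath for each module
    # then, in Eclipse: File > Import > Existing Projects into Workspace
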
> >
> >
> >
> >> I realize this sort of sounds like a dumb question, but I'm mostly
> >> curious what I might be missing out on if I stay away from anything
> >> other than vim, and not being entirely sure where Maven might be
> >> caching jars that it uses to build,
> >>
> >
> > That will be your local Maven repository, a hidden .m2 folder in your
> > user home directory.
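
For instance, the hadoop-hdfs artifacts built in this thread would be cached
under a path like the following (version taken from the jar name below):

    ls ~/.m2/repository/org/apache/hadoop/hadoop-hdfs/3.0.0-SNAPSHOT/
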
> >
> >
> >
> >> and how careful I have to be to ensure that my changes wind up in
> >> the right places without having to do a clean build every time.
> >>
> >>
> > Maven can detect changes (using mvn install instead of mvn clean
> > install), but I prefer doing clean builds.  You can use the
> > -Dmaven.test.skip setting to speed up your "mvn clean install" runs
> > if you don't wish to run the tests each time.
> >
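
The two skip flags behave differently, which is worth knowing for the test
jar discussed below; a quick sketch:

    mvn install -DskipTests             # compiles test sources, skips running them
    mvn install -Dmaven.test.skip=true  # skips compiling and running tests entirely
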
>
> Thanks to everyone for their advice last week, it's been helpful.
>
> You're spot-on that I'm new to Maven, but I'm a little confused about
> which targets/goals are best to use. Here's my scenario.
>
> What I'd like to get working is the DataNodeCluster, which lives in the
> tests.
>
> Running it from hadoop-hdfs-project/hadoop-hdfs/target as
> 'hadoop jar ./hadoop-hdfs-3.0.0-SNAPSHOT-tests.jar
> org.apache.hadoop.hdfs.DataNodeCluster
> -n 2'
>
> blows up with an NPE inside MiniDFSCluster - the offending line is
> 'dfsdir = conf.get(HDFS_MINIDFS_BASEDIR, null);' (line 2078 of
> MiniDFSCluster.java)
>
> I'm not worried about figuring out what's wrong (I'm pretty sure it's
> that conf is still null when this gets called) - what I'm really using
> this for is to understand what gets built when.
>
> Just to check, I added a System.out.println one line before 2078 of
> MiniDFSCluster, and recompiled from hadoop-common/hadoop-hdfs-project with
>
> mvn package -DskipTests
>
> because I don't want to run all the tests.
>
> This certainly compiles the code - if I leave the semicolon off my
> change, the compile fails, even with -DskipTests. However, it doesn't
> appear to rebuild
>
> target/hadoop-hdfs-3.0.0-SNAPSHOT/share/hadoop/hdfs/hadoop-hdfs-3.0.0-SNAPSHOT-tests.jar
> - the timestamp is still the old version.
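
One plausible explanation, offered as an assumption rather than a confirmed
diagnosis: mvn package does refresh target/hadoop-hdfs-3.0.0-SNAPSHOT-tests.jar
itself, but the copy under target/hadoop-hdfs-3.0.0-SNAPSHOT/share/hadoop/hdfs/
is laid out by the distribution assembly, which only runs under the dist
profile:

    cd hadoop-hdfs-project/hadoop-hdfs
    mvn package -Pdist -DskipTests   # re-runs the assembly that populates share/
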

Thread (continued):
  Gopal Vijayaraghavan   2013-01-16, 14:17
  Hitesh Shah            2013-01-16, 19:18