what are the core jars needed to compile a job in Hadoop 2.0.2 Alpha


Re: what are the core jars needed to compile a job in Hadoop 2.0.2 Alpha
If you are compiling in the old-world way (javac!), it is simpler to
use the whole classpath, given how modular the jars now are, for example:

$ javac -cp `hadoop classpath` -d wordcount_classes WordCount.java

Here, the shell expands the backquoted `hadoop classpath` command,
which prints a complete, usable classpath.
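
As a minimal sketch of the full compile-package-run cycle along those
lines (the jar name, the main class, and the /input and /output HDFS
paths below are only placeholders):

$ mkdir -p wordcount_classes
$ javac -cp `hadoop classpath` -d wordcount_classes WordCount.java
$ jar -cvf wordcount.jar -C wordcount_classes/ .
$ hadoop jar wordcount.jar WordCount /input /output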

If you want a better approach, I highly recommend using Apache Maven
or a similar build tool, with the hadoop-client dependency added instead.
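
For reference, pulling hadoop-client in through a Maven pom.xml would
look roughly like this (the version shown matches the 2.0.2-alpha
release you are asking about; adjust it to your cluster):

  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.0.2-alpha</version>
  </dependency>

Maven then resolves the transitive Hadoop jars for you, so there is no
classpath to assemble by hand.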

On Sat, Dec 15, 2012 at 9:11 PM, anand sharma <[EMAIL PROTECTED]> wrote:
> Hi, please can someone let me know what are the core jar files and dependencies
> we need to attach to the classpath for a job to compile successfully from a Java
> source.
>
> Say:
>
> javac -cp classpath -d wordcount_classes WordCount.java
>
> Here, what will the jar files be for it to compile successfully?
>
> Thanks

--
Harsh J

anand sharma 2012-12-16, 03:28
anand sharma 2012-12-16, 12:45