Avro user mailing list: Java MapReduce Avro Jackson Error


Deepak Nettem 2012-03-19, 22:48
Deepak Nettem 2012-03-19, 23:20
Tatu Saloranta 2012-03-19, 23:27
Deepak Nettem 2012-03-20, 00:05
Deepak Nettem 2012-03-20, 00:06
Scott Carey 2012-03-20, 01:06
Tatu Saloranta 2012-03-20, 01:12
Deepak Nettem 2012-03-20, 01:23
Tatu Saloranta 2012-03-20, 01:34
Something Something 2012-03-20, 01:43
Scott Carey 2012-03-20, 03:25
Deepak Nettem 2012-03-27, 02:58
Scott Carey 2012-03-27, 03:25
Deepak Nettem 2012-03-29, 14:28
Re: Java MapReduce Avro Jackson Error

On Mar 19, 2012, at 4:20pm, Deepak Nettem wrote:

> I found that the Hadoop lib directory contains jackson-core-asl-1.0.1.jar and jackson-mapper-asl-1.0.1.jar.
>
> I removed these, but got this error:
> hadoop Exception in thread "main" java.lang.NoClassDefFoundError: org/codehaus/jackson/map/JsonMappingException

Just confirming: did you restart the Hadoop daemons after removing these older Jackson jars?

-- Ken
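
A quick way to check which Jackson jars actually win on the job's classpath is to print where JsonFactory gets loaded from. This is a diagnostic sketch rather than anything from the thread (the WhichJackson class name is made up); run it with the same classpath the failing job sees:

    import org.codehaus.jackson.JsonFactory;

    public class WhichJackson {
        public static void main(String[] args) {
            // Print the jar JsonFactory was actually loaded from; if this still
            // points at jackson-core-asl-1.0.1.jar under the Hadoop lib
            // directory, the old jars are still shadowing the newer ones.
            Class<?> c = JsonFactory.class;
            System.out.println("JsonFactory loaded from: "
                    + c.getProtectionDomain().getCodeSource().getLocation());
            // Implementation version from the jar manifest, when present.
            Package p = c.getPackage();
            System.out.println("Version: "
                    + (p == null ? "unknown" : p.getImplementationVersion()));
        }
    }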

>
> I am using Maven as a build tool, and my pom.xml has this dependency:
>
>     <dependency>
>       <groupId>org.codehaus.jackson</groupId>
>       <artifactId>jackson-mapper-asl</artifactId>
>       <version>1.5.2</version>
>       <scope>compile</scope>
>     </dependency>
>    
> Any help on this issue would be greatly appreciated.
>
> Best,
> Deepak
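
One thing worth checking with the dependency above: jackson-mapper-asl 1.5.2 needs a matching jackson-core-asl, while Hadoop 0.20's lib directory ships the 1.0.1 jars, and mixing the two is an easy way to hit exactly this kind of missing-method and missing-class error. A hedged sketch of pinning both artifacts at the same version (the 1.5.2 here only mirrors the snippet quoted above; match it to whatever Jackson version your Avro release actually pulls in):

    <dependency>
      <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-core-asl</artifactId>
      <version>1.5.2</version>
      <scope>compile</scope>
    </dependency>
    <dependency>
      <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-mapper-asl</artifactId>
      <version>1.5.2</version>
      <scope>compile</scope>
    </dependency>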
>
> On Mon, Mar 19, 2012 at 6:48 PM, Deepak Nettem <[EMAIL PROTECTED]> wrote:
> When I include some Avro code in my Mapper, I get this error:
>
> Error: org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
>
> Particularly, just these two lines of code:
>
>     InputStream in = getClass().getResourceAsStream("schema.avsc");
>     Schema schema = Schema.parse(in);
>
> This code works perfectly when run as a standalone application outside of Hadoop. Why do I get this error, and what's the best way to get rid of it?
>
> I am using Hadoop 0.20.2, and writing code in the new API.

--------------------------
Ken Krugler
http://www.scaleunlimited.com
custom big data solutions & training
Hadoop, Cascading, Mahout & Solr
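
The two quoted lines read the schema with the static Schema.parse(InputStream), which newer Avro releases (1.5 and later) deprecate in favor of Schema.Parser, and a null check on getResourceAsStream() catches the other common failure here: the .avsc file simply not being packaged into the job jar. A minimal sketch, assuming the schema sits at the root of the jar (the LoadSchema class name and the leading "/" in the resource path are illustrative, not from the thread):

    import java.io.IOException;
    import java.io.InputStream;
    import org.apache.avro.Schema;

    public class LoadSchema {
        public static Schema load() throws IOException {
            // getResourceAsStream() resolves the path relative to this class's
            // package unless it starts with '/', so a wrong path quietly
            // returns null instead of throwing.
            InputStream in = LoadSchema.class.getResourceAsStream("/schema.avsc");
            if (in == null) {
                throw new IOException("schema.avsc not found on the classpath");
            }
            try {
                // Schema.Parser still parses the JSON with Jackson underneath,
                // so an old jackson-core-asl on the classpath breaks here too.
                return new Schema.Parser().parse(in);
            } finally {
                in.close();
            }
        }
    }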
Deepak Nettem 2012-03-20, 03:04