Hadoop, mail # user - how to resolve conflicts with jar dependencies


Jane Wayne 2013-03-12, 13:49
Luke Lu 2013-03-12, 23:03
Re: how to resolve conflicts with jar dependencies
Jane Wayne 2013-03-13, 15:19
thanks luke. that was informative.

unfortunately for me, we are still on hadoop v0.20.2.

if you or anyone else still has any feedback, given this new information,
on how to resolve dependency conflicts for hadoop v0.20.x, please let
me know.

your help is greatly appreciated.

On Tue, Mar 12, 2013 at 7:03 PM, Luke Lu <[EMAIL PROTECTED]> wrote:
> The problem is resolved in the next release of hadoop (2.0.3-alpha, cf.
> MAPREDUCE-1700).
>
> For hadoop 1.x based releases/distributions, put
> -Dmapreduce.user.classpath.first=true on the hadoop command line and/or
> in the client config.
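
for anyone on a 1.x cluster who finds this thread later: assuming the job's
driver runs through ToolRunner/GenericOptionsParser so the generic -D and
-libjars options are honored, the flag luke mentions would be passed roughly
like this (the jar, class and path names are made up for illustration):

    hadoop jar myjob.jar com.example.MyDriver \
        -Dmapreduce.user.classpath.first=true \
        -libjars jackson-core-asl-1.9.11.jar,jackson-mapper-asl-1.9.11.jar \
        input/ output/

the same property can presumably also be set to true in the client-side job
configuration, which is the "client config" part of the advice above.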
>
>
> On Tue, Mar 12, 2013 at 6:49 AM, Jane Wayne <[EMAIL PROTECTED]> wrote:
>
>> hi,
>>
>> i need to know how to resolve conflicts with jar dependencies.
>>
>> * first, my job requires Jackson JSON-processor v1.9.11.
>> * second, the hadoop cluster has Jackson JSON-processor v1.5.2. the
>> jars are installed in $HADOOP_HOME/lib.
>>
>> according to this link,
>> http://blog.cloudera.com/blog/2011/01/how-to-include-third-party-libraries-in-your-map-reduce-job/ ,
>> there are 3 ways to include 3rd party libraries in a map/reduce (mr)
>> job (see the sketch after the list):
>> * use the -libjars flag
>> * include the dependent libraries in the executing jar file's /lib
>> directory
>> * put the jars in the $HADOOP_HOME/lib directory
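
to be concrete, the first two options look roughly like this (the driver
class, jar and path names are invented for illustration, and the driver is
assumed to go through ToolRunner so -libjars is honored):

    # option 1: ship the extra jars with the job via the -libjars generic option
    hadoop jar myjob.jar com.example.MyDriver \
        -libjars jackson-core-asl-1.9.11.jar,jackson-mapper-asl-1.9.11.jar \
        input/ output/

    # option 2: bundle the same jars inside the job jar's lib/ directory,
    # so a listing of the job jar would show something like:
    jar tf myjob.jar
    #   com/example/MyDriver.class
    #   lib/jackson-core-asl-1.9.11.jar
    #   lib/jackson-mapper-asl-1.9.11.jar
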
>>
>> i can report that using -libjars and including the libraries in my
>> jar's /lib directory "do not work" (in my case of jar conflicts). i
>> still get a NoSuchMethodException. the only way to get my job to run
>> is the last option, placing the newer jars in $HADOOP_HOME/lib. the
>> last option is fine on a sandbox or development instance, but there
>> are some political difficulties (not only technical) in modifying our
>> production environment.
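
one way to see which copy is actually winning inside the tasks is to check
where the task jvm loaded the jackson classes from, e.g. with a tiny
standalone class like the sketch below, or the same println dropped into a
mapper's setup() so it lands in the task logs (plain java, just for
illustration):

    // prints the jar that ObjectMapper (jackson 1.x, org.codehaus.jackson)
    // was actually loaded from; if it points at $HADOOP_HOME/lib, the
    // cluster's 1.5.2 copy is shadowing the 1.9.11 jars shipped with the job.
    import org.codehaus.jackson.map.ObjectMapper;

    public class WhichJackson {
        public static void main(String[] args) {
            // getCodeSource() can be null for bootstrap classes, but not for
            // a class loaded from a regular jar on the classpath
            System.out.println(ObjectMapper.class.getProtectionDomain()
                                                 .getCodeSource()
                                                 .getLocation());
        }
    }
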
>>
>> my questions/concerns are:
>> 1. how come the -libjars and /lib options do not work? how does class
>> loading work in mr tasks?
>> 2. is there another option available that i am not aware of to get the
>> job's dependent jars to "overwrite" what's in $HADOOP_HOME/lib at
>> runtime of the mr tasks?
>>
>>
>> any help is appreciated. thank you all.
>>