Yup, I hate it when that happens.
You tend to see this more with Avro than anything else.
The issue is that in Java, the first matching class on the classpath wins: once the task JVM has loaded Base64 from Hadoop's commons-codec 1.4 jar, you can't unload it and replace it with the 1.7 copy shipped in your job jar.
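A quick way to confirm which jar a class was actually resolved from is to ask its ProtectionDomain for its code source. A minimal sketch (the class and method names here are illustrative, not part of any Hadoop API):

```java
// Sketch: report where a class was loaded from, to diagnose
// "first jar on the classpath wins" conflicts.
public class ClassOrigin {

    // Returns the jar or directory a class was resolved from,
    // or "(bootstrap)" for core classes that have no code source.
    public static String locate(String className) throws ClassNotFoundException {
        Class<?> c = Class.forName(className);
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        return (src == null) ? "(bootstrap)" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // On a task node you would check the conflicting class, e.g.:
        //   locate("org.apache.commons.codec.binary.Base64")
        // which would print the path of whichever commons-codec jar won.
        System.out.println(locate("java.lang.String"));
    }
}
```

Run that inside a mapper (or a small job) and the printed path tells you whether Hadoop's lib directory or your job jar supplied the class.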
The only solution we found workable is to replace the jar on every node in the cluster with the latest version.
The drawback is the risk of incompatibilities between releases.
The other option is to roll your code back to the earlier version. (Yeah, I know. You need that method, which is why I made the first recommendation. ;-)
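If swapping jars on the cluster isn't an option, another route that can work is shading: relocate commons-codec inside your fat jar so your 1.7 classes live under a different package name and Hadoop's 1.4 copy is never consulted. A hedged sketch of a maven-shade-plugin configuration (the shadedPattern name is just an example):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- Rewrites your bytecode to reference the renamed copy,
                 so the 1.4 jar on the node can no longer shadow it. -->
            <pattern>org.apache.commons.codec</pattern>
            <shadedPattern>shaded.org.apache.commons.codec</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

This replaces the plain "jar-with-dependencies" assembly; the resulting jar carries its own relocated codec classes regardless of what is in Hadoop's lib directory.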
On Oct 3, 2012, at 6:21 AM, Ben Rycroft <[EMAIL PROTECTED]> wrote:
> Hi all,
> I have a jar that uses the Hadoop API to launch various remote MapReduce jobs (i.e., I'm not using the command line to initiate the jobs). The service jar that executes the various jobs is built with Maven's "jar-with-dependencies".
> My jobs all run fine except one that uses commons-codec 1.7, I get:
> FATAL org.apache.hadoop.mapred.Child: Error running child : java.lang.NoSuchMethodError: org.apache.commons.codec.binary.Base64.encodeAsString([B)Ljava/lang/String;
> I think this is because my jar is including commons-codec 1.7 whereas my Hadoop install's lib has commons-codec 1.4 ...
> Is there any way to instruct Hadoop to use the distributed commons-codec 1.7 (I assume this is distributed as a job dependency) rather than the commons-codec 1.4 in the Hadoop 1.0.3 core lib?
> Removing commons-codec-1.4.jar from my Hadoop library folder did seem to solve the problem for a while, but it isn't working on another VM. Replacing the 1.4 jar with 1.7 does seem to fix the problem, but that doesn't feel too sane. Hopefully there is a better alternative.