How to package multiple jars for a Hadoop job
Mark Kerzner 2011-02-18, 22:18
I have a script that I use to re-package all the jars that NetBeans outputs
into the dist directory; it arranges everything correctly into a single jar
for running a MapReduce job. It is below, but I am not sure it is best
practice, and besides, it hard-codes my paths. I am sure there is a better
way.
# to be run from the project directory
# unpack the NetBeans-built jar (and its dependency classes) in place
jar -xf MR.jar
# repackage everything under the original manifest; note the hard-coded output path
jar -cmf META-INF/MANIFEST.MF /home/mark/MR.jar *
echo "Repackaged for Hadoop"