I am using simple NetBeans scripts which I am augmenting a little, but it
seems I need to use Maven anyway.
On Sun, Feb 20, 2011 at 8:22 PM, Jun Young Kim <[EMAIL PROTECTED]> wrote:
> There is a Maven plugin for packaging a Hadoop job.
> I think it is quite a convenient tool for this.
> If you are using it, add this one to your pom.xml.
> Junyoung Kim ([EMAIL PROTECTED])
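[The pom.xml snippet referenced above did not survive in this archive. As a stand-in, here is a sketch of one commonly used setup, the maven-assembly-plugin with the jar-with-dependencies descriptor; this is not necessarily the plugin the author meant, and the main class name is a hypothetical placeholder.]

```xml
<!-- Sketch only; the plugin the author attached is unknown. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <!-- builds a single jar containing all runtime dependencies -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <!-- hypothetical driver class; replace with your job's main class -->
        <mainClass>com.example.MyJob</mainClass>
      </manifest>
    </archive>
  </configuration>
</plugin>
```

Running `mvn package assembly:single` then produces `*-jar-with-dependencies.jar` under target/.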
> On 02/19/2011 07:23 AM, Eric Sammer wrote:
>> You have a few options. You can:
>> 1. Package dependent jars in a lib/ directory of the jar file.
>> 2. Use something like Maven's assembly plugin to build a self-contained jar.
>> Either way, I'd strongly recommend using something like Maven to build
>> artifacts so they're reproducible and in line with commonly used tools. Hand
>> packaging files tends to be error prone. This is less of a Hadoop-ism and
>> more of a general Java development issue, though.
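[Option 1 above relies on Hadoop adding any jars found under lib/ inside the job jar to the task classpath. A minimal sketch of staging that layout follows; every file name here is a hypothetical placeholder, and the final `jar cf` step (which needs a JDK) is left as a comment.]

```shell
# Stage the layout for option 1: job classes at the jar root,
# dependent jars under lib/. All names below are placeholders.
mkdir -p staging/com/example staging/lib
touch staging/com/example/MyJob.class   # placeholder for a compiled job class
touch staging/lib/some-dependency.jar   # placeholder for a dependent jar
find staging -type f | sort
# jar cf job.jar -C staging .           # final packaging (requires a JDK)
```

The `find` listing is just a sanity check that classes and lib/ jars ended up in the right places before packaging.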
>> On Fri, Feb 18, 2011 at 5:18 PM, Mark Kerzner<[EMAIL PROTECTED]> wrote:
>>> I have a script that I use to re-package all the jars (which are output to
>>> the dist directory by NetBeans) - and it structures everything correctly into a
>>> single jar for running a MapReduce job. Here it is below, but I am not sure
>>> if it is the best practice. Besides, it hard-codes my paths. I am sure
>>> there is a better way.
>>> # to be run from the project directory
>>> cd ../dist
>>> jar -xf MR.jar                                      # unpack the NetBeans build jar
>>> jar -cmf META-INF/MANIFEST.MF /home/mark/MR.jar *   # repackage with the manifest
>>> cd ../bin
>>> echo "Repackaged for Hadoop"
>>> Thank you,
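[One way to remove the hard-coded paths Mark mentions is to lift them into overridable variables. A sketch under assumed names (DIST_DIR, OUT_JAR are not from the original script); DRY_RUN=1, the default here, prints each command instead of executing it, so the flow can be inspected without a JDK or a built jar. Set DRY_RUN=0 to actually run it.]

```shell
#!/bin/sh
# Parameterized sketch of the repackaging script above.
# DIST_DIR and OUT_JAR are assumed names; override them in the environment.
DRY_RUN="${DRY_RUN:-1}"
DIST_DIR="${DIST_DIR:-../dist}"
OUT_JAR="${OUT_JAR:-$HOME/MR.jar}"

# Print the command in dry-run mode, otherwise execute it.
run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "+ $*"
  else
    "$@"
  fi
}

run cd "$DIST_DIR"
run jar -xf MR.jar
run jar -cmf META-INF/MANIFEST.MF "$OUT_JAR" .
echo "Repackaged for Hadoop"
```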