Frank Grimes 2012-01-26, 19:36
A simple hack would be this:
Write up a simple driver class as you would for regular, non-Oozie MR
code, configure your Avro job as normal, and then, instead of the
submit API bits at the end, try:
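(The exact snippet was dropped from the archive; below is a minimal sketch of the idea, assuming the old `org.apache.hadoop.mapred` API that AvroJob targets. The class name and schemas here are placeholders — substitute your real job setup.)

```java
import org.apache.avro.Schema;
import org.apache.avro.mapred.AvroJob;
import org.apache.hadoop.mapred.JobConf;

// Hypothetical driver: configure the Avro job as usual, then dump the
// resulting JobConf as XML instead of submitting it to the cluster.
public class DumpAvroJobConf {
  public static void main(String[] args) throws Exception {
    JobConf jobConf = new JobConf(DumpAvroJobConf.class);

    // Placeholder schemas -- use your real input/output schemas here.
    AvroJob.setInputSchema(jobConf, Schema.create(Schema.Type.STRING));
    AvroJob.setOutputSchema(jobConf, Schema.create(Schema.Type.STRING));

    // Instead of JobClient.runJob(jobConf), write the config out as XML:
    jobConf.writeXml(System.out);
  }
}
```

`Configuration.writeXml` prints every property AvroJob has set, which is exactly what you then copy into the Oozie action.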
This will give you a simple job config XML dump with the configs Avro
has put in, and you can then port them into your <map-reduce> action as
config elements, etc. Should be fairly trivial to port.
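For illustration, the ported action might look something like the fragment below. The property names are what the Avro 1.x "mapred" AvroJob helper sets under the hood; the mapper class is a placeholder — verify everything against the XML dump from your own driver.

```xml
<!-- Hypothetical Oozie workflow fragment; confirm names/values against
     your driver's writeXml() output. AvroJob also adjusts
     io.serializations to include AvroSerialization. -->
<map-reduce>
  <job-tracker>${jobTracker}</job-tracker>
  <name-node>${nameNode}</name-node>
  <configuration>
    <property>
      <name>mapred.input.format.class</name>
      <value>org.apache.avro.mapred.AvroInputFormat</value>
    </property>
    <property>
      <name>mapred.output.format.class</name>
      <value>org.apache.avro.mapred.AvroOutputFormat</value>
    </property>
    <property>
      <name>avro.input.schema</name>
      <value>"string"</value>
    </property>
    <property>
      <name>avro.output.schema</name>
      <value>"string"</value>
    </property>
    <property>
      <name>avro.mapper</name>
      <value>com.example.MyAvroMapper</value>
    </property>
  </configuration>
</map-reduce>
```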
Will this work for you?
Please do share a template MR avro job workflow on the list if you've
got one in the end, in spirit of http://xkcd.com/979/ :)
(Hat tip to Alejandro from Oozie for the tip)
On Fri, Jan 27, 2012 at 1:06 AM, Frank Grimes <[EMAIL PROTECTED]> wrote:
> Hi All,
> We're trying to evaluate using Oozie (http://incubator.apache.org/oozie/) to
> run Hadoop MapReduce jobs over Avro data.
> As far as I can tell, Oozie configures the JobConf it submits to Hadoop
> based on external config files.
> see e.g.
> http://mail-archives.apache.org/mod_mbox/avro-user/201110.mbox/<[EMAIL PROTECTED]>
> I'm wondering if anybody out there has an example of how to set up/run an
> Avro MapReduce job without relying on the AvroJob.set* helper methods.
> Or better yet, an Oozie example of the same.
> Frank Grimes
Customer Ops. Engineer, Cloudera