Indeed, there has been no mention that the LinkedIn Kafka/Hadoop ETL code
has been released.
I'm glad to see that the little script I made is useful for others :) ...
If you want to consume Binary Avro and write it straight into Hadoop, you
should be able to use the regular hadoop-consumer contrib (or the
incremental-consumer, which is just a wrapper around the hadoop-consumer).
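For anyone wiring this up, it may help to see what "Binary Avro" actually looks like on the wire. Below is a minimal stdlib-only Python sketch of the Avro spec's primitive binary encoding (zig-zag longs and length-prefixed strings) — in practice the avro library's BinaryEncoder does this for you; this is just to illustrate the format the consumer would be receiving:

```python
# Sketch of Avro's binary encoding for primitive types, standard library only.
# Per the Avro spec: ints/longs are zig-zag encoded, then written as a
# variable-length base-128 little-endian varint; strings are a long
# byte-length prefix followed by UTF-8 data.

def encode_long(n: int) -> bytes:
    """Zig-zag + varint encoding of a long, as in the Avro spec."""
    z = (n << 1) ^ (n >> 63)  # zig-zag: small magnitudes -> small codes
    out = bytearray()
    while True:
        b = z & 0x7F
        z >>= 7
        if z:
            out.append(b | 0x80)  # high bit set: more bytes follow
        else:
            out.append(b)
            return bytes(out)

def encode_string(s: str) -> bytes:
    """Avro string: long byte-length prefix, then the UTF-8 bytes."""
    data = s.encode("utf-8")
    return encode_long(len(data)) + data

if __name__ == "__main__":
    print(encode_long(1))        # b'\x02'
    print(encode_long(-1))       # b'\x01'
    print(encode_string("abc"))  # b'\x06abc'
```

Decoding is the mirror image, so a Hadoop-side consumer can either pass the raw bytes through to an Avro container file or decode them against the writer's schema.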
On Mon, Feb 13, 2012 at 4:18 AM, Sebastian Eichner <
[EMAIL PROTECTED]> wrote:
> We want to use Kafka to transport Avro messages, store them in
> Avro format on HDFS, and process them with Hadoop.
> So far I have the Hadoop-Incremental-Consumer from Felix and a simple
> Avro producer working. My next step would be to make the
> Hadoop-Consumer receive Binary Avro and store it in files, but before
> diving into this I wanted to ask:
> Is there any other code for this scenario already available? So far I
> could not find anything in the list archives or on Google. In the
> archives I read that LinkedIn does something similar but afaik it's
> not yet released.