S3 Consumer for Super Duper Blog Post!
Russell Jurney 2012-08-18, 04:49
Ok, this is the last time I'm gonna beg for an S3 sink for Kafka. I'm
not trolling, and this is Your Big Chance to help!
I'm gonna blog about using Whirr to boot ZooKeeper and Kafka in the
cloud, then create events in an application that get sunk to Amazon
S3, where they'll be processed by Pig/Hadoop/Elastic MapReduce, mined
into gems, republished in some esoteric NoSQL DB, and finally served
in the very app that generated the events in the first place.
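For anyone tempted to step up: the core of such a sink is just buffering consumed messages per topic-partition and writing each batch to an offset-keyed S3 object, so batches are idempotent and resumable. A minimal sketch of that batching logic, in Python for brevity (all names here, like `S3SinkBuffer` and `batch_key`, are illustrative assumptions, not from any real project; a real consumer would PUT each flushed payload to S3 via an S3 client):

```python
def batch_key(topic, partition, start_offset):
    """Build an S3 object key encoding topic/partition/starting offset,
    so a restarted sink can tell exactly what has already been written."""
    return f"{topic}/partition={partition}/{start_offset:020d}.log"


class S3SinkBuffer:
    """Accumulate messages per (topic, partition); flush() returns
    (key, payload) pairs ready to be uploaded as S3 objects."""

    def __init__(self, max_messages=1000):
        self.max_messages = max_messages
        # (topic, partition) -> (start_offset, [raw message bytes])
        self.buffers = {}

    def add(self, topic, partition, offset, message):
        """Buffer one message; return True when this partition's
        buffer is full and the caller should flush()."""
        key = (topic, partition)
        if key not in self.buffers:
            self.buffers[key] = (offset, [])
        self.buffers[key][1].append(message)
        return len(self.buffers[key][1]) >= self.max_messages

    def flush(self):
        """Drain all buffers, returning (s3_key, payload) pairs."""
        out = []
        for (topic, partition), (start, msgs) in self.buffers.items():
            out.append((batch_key(topic, partition, start), b"\n".join(msgs)))
        self.buffers = {}
        return out
```

Offsets in the object key double as a commit log: after a crash, the next consumer run can list the bucket prefix and resume from the highest offset already persisted.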
So, if someone else doesn't contribute an S3 consumer for Kafka in the
next month or so... so help me Bob, I'm gonna write it myself. Now,
some of you may not know me, but I am the 3rd best software engineer
in the world: http://www.quora.com/Who-are-some-of-the-best-software-engineers-alive
Those of you who have seen my code, however, are aware that as a
programmer, I am substandard. There's a gene that imparts exception
handling and algorithms, and it's missing from my genome.
So let me be clear: you don't want me to write the S3 sink. A Kafka
committer or someone with a real job should write the S3 sink. As soon
as that thing is written and my blog post goes out, Kafka use will
spike and you'll all be famous.
So this is a direct threat: I am writing an S3 consumer for Kafka
unless one of you steps up. And you will rue the day that piece of
code sees the light of day.
In return for your contribution, you will be named in my blog post as
open source citizen of the month, to be accompanied by a commemorative
plaque with a pixelated photo of me.
Russell Jurney http://datasyndrome.com
Matthew Rathbone 2012-08-18, 16:20
Russell Jurney 2012-08-19, 14:09
Russell Jurney 2012-08-19, 14:10
Matthew Rathbone 2012-08-20, 16:10
Niek Sanders 2012-08-24, 02:46
Parviz deyhim 2012-10-05, 03:43