If the Hadoop consumer/producer use case remains relevant for Kafka
(and I assume it will), it would make sense to keep the core components
(at least the Kafka input/output formats) as part of Kafka itself, so that
they can be built, tested, and versioned together to maintain compatibility.
This would also make it easier to build custom MR jobs on top of Kafka,
rather than having to decouple pieces from Camus.
It would also be less confusing for users, at least when starting out;
Camus could then use these components instead of providing its own.
That being said, we did some work on the consumer side (0.8 and the new(er)
We could probably try to rewrite that to use Camus, or fix Camus, or
whatever, but please consider this alternative as well.
On 7/3/13 11:06 AM, "Sam Meder" <[EMAIL PROTECTED]> wrote: