You can use Kafka to store data on Hadoop via the Hadoop consumer in
contrib, then use Talend or Pig to ETL it, and finally emit the
transformed records back to Kafka via the Hadoop producer in contrib.
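As a rough illustration of the consume -> transform -> produce loop described above, here is a minimal sketch. It is hypothetical: in-memory lists stand in for the Kafka topics, and the `etl_transform` function and sample records are invented for illustration; a real pipeline would run the contrib Hadoop consumer/producer (or a Kafka client) around the same transform step.

```python
# Hypothetical sketch of a consume -> ETL -> produce loop.
# Lists stand in for Kafka source/sink topics; the transform is the
# kind of cleanup step you might otherwise express in Pig or Talend.

def etl_transform(record):
    """Example transform: parse a comma-separated line into a cleaned dict."""
    user, raw_count = record.split(",")
    return {"user": user.strip().lower(), "count": int(raw_count)}

def run_pipeline(source_topic, sink_topic):
    for record in source_topic:                    # stand-in for consuming
        sink_topic.append(etl_transform(record))   # stand-in for producing

source = ["Alice, 3", "BOB,5"]   # invented sample records
sink = []
run_pipeline(source, sink)
print(sink)  # [{'user': 'alice', 'count': 3}, {'user': 'bob', 'count': 5}]
```

The transform itself is the only Kafka-independent piece, which is why it can be unit-tested in isolation before being wired into whichever consumer/producer you settle on.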

Russell Jurney

On Jan 6, 2013, at 2:29 PM, David Arthur <[EMAIL PROTECTED]> wrote:

Storm has support for Kafka, if that's the sort of thing you're looking
for. Maybe you could describe your use case a bit more?

On Sunday, January 6, 2013, Guy Doulberg wrote:

I am looking for an ETL tool that can connect to Kafka, as a consumer and
as a producer.
Have you heard of such a tool?

David Arthur
