Kafka user mailing list: Flexible Producers and Consumers


Re: Flexible Producers and Consumers
We do have a console producer that allows you to pipe in log data in string
format, and a log4j Kafka appender. It would be great if people could
contribute other types of adapters.

Thanks,

Jun
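
A minimal sketch of the kind of adapter described above: a small program that
reads log lines from stdin and sends each one to a Kafka topic, roughly what
the bundled console producer does. It assumes the newer
org.apache.kafka.clients.producer Java client (not necessarily the client that
was current at the time of this thread); the broker address and the "logs"
topic name are placeholders.

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.Properties;

// Reads log lines from stdin and ships each one to a Kafka topic,
// similar in spirit to the bundled console producer.
public class StdinLogProducer {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // placeholder broker address
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props);
             BufferedReader in = new BufferedReader(new InputStreamReader(System.in))) {
            String line;
            while ((line = in.readLine()) != null) {
                // Each log line becomes one message on the "logs" topic (placeholder name).
                producer.send(new ProducerRecord<>("logs", line));
            }
        }
    }
}

Something like "tail -F /var/log/syslog | java StdinLogProducer" would then
stream a local syslog into Kafka, which is the pattern asked about in the
quoted message below.
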
On Tue, Jul 16, 2013 at 11:31 AM, Danny Patel <[EMAIL PROTECTED]> wrote:

> I've read through many sites about using Kafka for log aggregation, but I
> haven't really found anything that talks about how people are actually
> shipping their logs into Kafka and consuming them. I'm really interested in
> an implementation that would watch any kind of log (local syslogs and
> application logs) and ship it into Kafka in near real time. I think products
> like Logstash and Flume really shine in this area, as they have a multitude
> of options for shipping any data stream into a central aggregation service.
>
> Since Kafka is proclaimed to be far more scalable, I'm hoping there are
> options such as (http://logstash.net/docs/1.1.13/) to be able to vacuum up
> any data source, put it into Kafka queues, and then consume them.
>
> Any suggestions?
>
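
On the consuming side of the question above, a minimal sketch of a downstream
consumer that reads the shipped log lines back out of Kafka, again assuming
the newer org.apache.kafka.clients.consumer Java client; the broker address,
consumer group id, and "logs" topic name are placeholders.

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

// Reads log lines back out of Kafka, e.g. to index or archive them downstream.
public class LogConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder broker address
        props.put("group.id", "log-aggregation");           // placeholder consumer group
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("logs"));  // placeholder topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // Hand each log line to whatever sink you use (search index, HDFS, etc.).
                    System.out.println(record.value());
                }
            }
        }
    }
}
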

 