Kafka, mail # user - default producer to retro-fit existing log files collection process?
Re: default producer to retro-fit existing log files collection process?
Benjamin Black 2013-09-04, 18:26
commons-logging can bind to log4j, so perhaps you just need that binding
plus the log4j-kafka appender to achieve your goal?
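If you go that route, the wiring is a log4j configuration change rather than code. A minimal sketch of a log4j.properties, assuming the 0.8-era bundled appender class and its property names (broker address and topic below are example values, not from this thread):

```properties
# Route log output (via the commons-logging -> log4j binding) to Kafka.
# Class and property names assume Kafka 0.8's bundled appender; check your version.
log4j.rootLogger=INFO, KAFKA

log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.brokerList=localhost:9092
log4j.appender.KAFKA.topic=weblogs
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d %p %c - %m%n
```

The Kafka client jars (and their dependencies) need to be on the application's classpath for the appender class to load.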
On Tue, Sep 3, 2013 at 2:08 PM, Maxime Petazzoni
<[EMAIL PROTECTED]> wrote:

> Tomcat uses commons-logging for logging. You might be able to write an
> adapter towards Kafka, similar to the log4j-kafka appender. I think this
> would be cleaner than writing something Tomcat-specific that intercepts
> your requests and logs them through Kafka.
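The rough shape of such an adapter, sketched here against the log4j 1.x AppenderSkeleton API and the Kafka 0.8 producer (the class name, topic, and broker list are illustrative, not an existing appender; this needs the Kafka jars and a running broker):

```java
import java.util.Properties;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;

import org.apache.log4j.AppenderSkeleton;
import org.apache.log4j.spi.LoggingEvent;

// Illustrative sketch: a log4j 1.x appender that publishes each formatted
// log event to a Kafka topic. Topic and broker list are hypothetical defaults.
public class KafkaLogAppender extends AppenderSkeleton {
    private Producer<String, String> producer;
    private String topic = "weblogs";            // hypothetical topic name
    private String brokerList = "localhost:9092";

    @Override
    public void activateOptions() {
        // Called by log4j once all properties have been set.
        Properties props = new Properties();
        props.put("metadata.broker.list", brokerList);
        props.put("serializer.class", "kafka.serializer.StringEncoder");
        producer = new Producer<String, String>(new ProducerConfig(props));
    }

    @Override
    protected void append(LoggingEvent event) {
        // Format the event with the configured layout and send it to Kafka.
        producer.send(new KeyedMessage<String, String>(topic, layout.format(event)));
    }

    @Override
    public void close() {
        if (producer != null) producer.close();
    }

    @Override
    public boolean requiresLayout() {
        return true;
    }

    // Setters so topic and brokerList can be configured from log4j.properties.
    public void setTopic(String topic) { this.topic = topic; }
    public void setBrokerList(String brokerList) { this.brokerList = brokerList; }
}
```

The same pattern would work for a direct commons-logging adapter, but going through the log4j binding avoids reimplementing the configuration plumbing.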
>
> /Max
> --
> Maxime Petazzoni
> Sr. Platform Engineer
> m 408.310.0595
> www.turn.com
>
> ________________________________________
> From: Yang [[EMAIL PROTECTED]]
> Sent: Tuesday, September 03, 2013 10:09 AM
> To: [EMAIL PROTECTED]
> Subject: default producer to retro-fit existing log files collection
> process?
>
> In many setups, production web server logs are rotated on local disks and
> then collected by some sort of scp process.
>
> I guess the ideal way to use Kafka is to write a module for Tomcat that
> catches each request and sends it through the Kafka API. But is there a
> "quick and dirty" producer included with Kafka that just reads the
> existing rotated logs and sends them through the Kafka API? That would
> avoid having to touch the existing Java code.
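One quick-and-dirty option along these lines is to pipe the rotated logs into the console producer that ships in Kafka's bin/ directory (the flag names below are from the 0.8-era tooling, and the path, broker, and topic are example values):

```shell
# Follow a rotated access log and pipe each line into Kafka's bundled
# console producer. Path, topic, and broker are examples; flag names
# are from the 0.8-era tooling (--broker-list, --topic).
tail -F /var/log/tomcat/access.log | \
  bin/kafka-console-producer.sh --broker-list localhost:9092 --topic weblogs
```

Note the capital -F: unlike -f, it keeps following the file name across rotations, which matters here since the logs are rotated in place.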
>
> thanks
> Yang
>