Thanks for the useful information guys. The use case does require as close
to real-time processing of the log files as possible, so, useful as a Perl
script in the right place undeniably is, the Morphline Solr Sink looks like
the best fit in this case. In fact just the search term 'Complex Event
Processing' has opened up a whole new avenue of investigation.
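For anyone else following along: as a toy illustration of what a CEP-style
rule over a stream of log events might look like (independent of Flume or
morphlines), here is a minimal Python sketch. The event format, the
five-errors threshold, and the 60-second window are all made up for
illustration, not taken from any of the tools discussed.

```python
from collections import defaultdict, deque

WINDOW_SECS = 60   # sliding window length (illustrative)
THRESHOLD = 5      # errors per host within the window that trigger an alert

def detect_error_bursts(events):
    """Yield (timestamp, host) whenever a host logs THRESHOLD ERROR
    events inside WINDOW_SECS. `events` is an iterable of
    (timestamp, host, level) tuples, assumed ordered by timestamp."""
    recent = defaultdict(deque)  # host -> timestamps of recent errors
    for ts, host, level in events:
        if level != "ERROR":
            continue
        q = recent[host]
        q.append(ts)
        while q and ts - q[0] > WINDOW_SECS:  # drop errors outside the window
            q.popleft()
        if len(q) >= THRESHOLD:
            yield ts, host
            q.clear()  # reset so one burst fires a single alert

# Example: web1 logs 5 errors in quick succession, web2 stays quiet.
stream = [(t, "web1", "ERROR") for t in range(0, 50, 10)] + [(55, "web2", "INFO")]
print(list(detect_error_bursts(stream)))  # -> [(40, 'web1')]
```

A real deployment would of course run this against a live stream (e.g. as a
Flume interceptor or a consumer on the central server) rather than a list,
but the windowed-matching idea is the same.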
On 26 August 2013 23:12, Wolfgang Hoschek <[EMAIL PROTECTED]> wrote:
> Take a look at the Apache Flume Morphline Solr Sink, e.g. for heavy-duty
> ETL processing and
> ingestion into Solr:
> It provides a scripting engine that enables CEP on the flow of log events.
> On Aug 26, 2013, at 1:22 PM, Mark Nuttall-Smith wrote:
> > Hi, I posted this question on stackoverflow (
> but thought I might get a better response here, so am crossposting... hope
> it's ok!
> > I would like some design advice for a centralized logging project I am
> considering. I have a number of components producing logs on various
> servers. Apache Flume looks like the sensible choice for streaming to a
> central log server, most likely into an elasticsearch instance for querying
> and analysis.
> > Here's my question: I would like to provide a scripting engine listening
> to the flow of log events arriving on the central server. Would it make
> sense to do that as an interceptor in Flume, or as a plugin to
> elasticsearch, or something else completely?
> > Thanks,
> > Mark
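
P.S. For the archives, a morphline config for the sink is a HOCON file
naming a pipeline of commands, roughly like the sketch below. This is from
memory rather than a tested config: the command names (readLine, grok,
loadSolr) come from the morphlines reference guide, but the importCommands
package prefix varies by version, and the grok expression and SOLR_LOCATOR
variable here are placeholders.

```
# Sketch of a morphline pipeline: read lines, parse with grok, load into Solr.
morphlines : [
  {
    id : morphline1
    importCommands : ["org.kitesdk.**"]   # package prefix depends on your morphlines version
    commands : [
      { readLine { charset : UTF-8 } }
      { grok {
          dictionaryFiles : [grok-dictionaries]
          expressions : { message : """%{COMBINEDAPACHELOG}""" }
        }
      }
      { loadSolr { solrLocator : ${SOLR_LOCATOR} } }
    ]
  }
]
```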