Re: Automatic log analysis and alert generation
Thanks for the useful information, guys. The use case does require processing
of the log files as close to real time as possible, so, undeniably useful as a
Perl script in the right place is, the Morphline Solr Sink looks like the best
fit in this case. In fact, just the search term 'Complex Event Processing' has
opened up a whole new avenue of investigation.
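
For reference, here is a rough sketch of how that might be wired up, going by
the Flume User Guide page linked in the quoted message below. The agent and
channel names, file path, collection name and zkHost are placeholders rather
than anything from this thread:

    # Flume agent configuration (a1/k1/c1 are placeholder names)
    a1.sinks.k1.type = org.apache.flume.sink.solr.morphline.MorphlineSolrSink
    a1.sinks.k1.channel = c1
    a1.sinks.k1.morphlineFile = /etc/flume-ng/conf/morphline.conf
    a1.sinks.k1.morphlineId = morphline1

    # morphline.conf -- the command chain is where the per-event logic goes
    SOLR_LOCATOR : {
      collection : logs                 # placeholder collection name
      zkHost : "127.0.0.1:2181/solr"    # placeholder ZooKeeper ensemble
    }
    morphlines : [
      {
        id : morphline1
        # package prefix depends on the morphlines release in use
        importCommands : ["com.cloudera.**", "org.apache.solr.**"]
        commands : [
          { readLine { charset : UTF-8 } }
          { logDebug { format : "output record: {}", args : ["@{}"] } }
          { loadSolr { solrLocator : ${SOLR_LOCATOR} } }
        ]
      }
    ]

The morphline command chain is the scripting engine Wolfgang mentions; parsing,
filtering and alert-style routing commands would slot in between readLine and
loadSolr.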

Cheers,
Mark
On 26 August 2013 23:12, Wolfgang Hoschek <[EMAIL PROTECTED]> wrote:

> Take a look at the Apache Flume Morphline Solr Sink, for example, for
> heavy-duty ETL processing and ingestion into Solr:
>
>        http://flume.apache.org/FlumeUserGuide.html#morphlinesolrsink
>
> It provides a scripting engine that enables CEP on the flow of log events.
>
> Wolfgang.
>
> On Aug 26, 2013, at 1:22 PM, Mark Nuttall-Smith wrote:
>
> > Hi, I posted this question on stackoverflow (
> http://stackoverflow.com/questions/18448218/automatic-log-analysis-and-alert-generation),
> but thought I might get a better response here, so am crossposting... hope
> it's ok!
> >
> > I would like some design advice for a centralized logging project I am
> considering. I have a number of components producing logs on various
> servers. Apache Flume looks like the sensible choice for streaming to a
> central log server, most likely into an Elasticsearch instance for querying
> and analysis.
> >
> > Here's my question: I would like to provide a scripting engine listening
> to the flow of log events arriving on the central server. Would it make
> sense to do that as an interceptor in Flume, or as a plugin to
> Elasticsearch, or something else completely?
> >
> > Thanks,
> >
> > Mark
> >
> >
>
>
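
As for the interceptor option raised in the quoted question: a custom Flume
interceptor is one small Java class attached to the source. The sketch below
simply tags events whose body matches a pattern, so that a downstream channel
selector or sink can route or alert on them; the package, class name and the
"regex" property are illustrative, not part of Flume itself.

    package com.example.flume;  // hypothetical package

    import java.nio.charset.StandardCharsets;
    import java.util.List;
    import java.util.regex.Pattern;

    import org.apache.flume.Context;
    import org.apache.flume.Event;
    import org.apache.flume.interceptor.Interceptor;

    public class AlertTaggingInterceptor implements Interceptor {

      private final Pattern pattern;

      private AlertTaggingInterceptor(Pattern pattern) {
        this.pattern = pattern;
      }

      @Override public void initialize() { }

      @Override
      public Event intercept(Event event) {
        String body = new String(event.getBody(), StandardCharsets.UTF_8);
        if (pattern.matcher(body).find()) {
          // Tag the event; routing or alerting happens downstream, keyed on this header.
          event.getHeaders().put("alert", "true");
        }
        return event;
      }

      @Override
      public List<Event> intercept(List<Event> events) {
        for (Event e : events) {
          intercept(e);
        }
        return events;
      }

      @Override public void close() { }

      /** Builder named in the agent configuration. */
      public static class Builder implements Interceptor.Builder {
        private Pattern pattern;

        @Override
        public void configure(Context context) {
          // "regex" is an illustrative property name, not a Flume built-in.
          pattern = Pattern.compile(context.getString("regex", "ERROR|FATAL"));
        }

        @Override
        public Interceptor build() {
          return new AlertTaggingInterceptor(pattern);
        }
      }
    }

It would then be wired up on the source, roughly:

    a1.sources.r1.interceptors = i1
    a1.sources.r1.interceptors.i1.type = com.example.flume.AlertTaggingInterceptor$Builder
    a1.sources.r1.interceptors.i1.regex = ERROR|FATAL

Interceptors are best kept to simple per-event decoration, though; anything that
needs state across many events (rates, correlations, real CEP) is arguably better
done further downstream, e.g. in a morphline or in whatever consumes the
Elasticsearch/Solr index.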