Flume >> mail # user >> Dynamic Key=Value Parsing with an Interceptor?

Hey, we'd like to set up a default format for all of our logging systems...
perhaps looking like this:
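Something along these lines, say with a pipe as the separator (the separator and field names here are just illustrative, nothing is decided yet):

```
timestamp=2013-10-07T18:32:11Z|service=billing|level=ERROR|msg=connection timeout
```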


With this pattern, we'd allow developers to define any key/value pairs they
want to log, and separate them with a common separator.

If we did this, what would we need to do in Flume to parse the key=value
pairs out into dynamic headers? We pass our data from Flume into both HDFS
and ElasticSearch sinks, and we'd really like these fields sent to the
sinks dynamically for much easier parsing and analysis.

Any thoughts on this? I know we could define a unique interceptor for
each service that looks for explicit field names ... but that's a nightmare
to manage. I really want something truly dynamic.
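To make the ask concrete, here's roughly the parsing core such an interceptor would need. This is a minimal sketch, not working Flume code: the class and method names are mine, and a real version would implement org.apache.flume.interceptor.Interceptor, run this over new String(event.getBody()) in intercept(Event), and merge the result into event.getHeaders().

```java
import java.util.HashMap;
import java.util.Map;

public class KvParser {
    // Split the event body on a common separator (a regex, e.g. "\\|" for a
    // pipe) and lift every token that looks like key=value into a header map.
    // Tokens without an '=' (or with an empty key) are simply skipped.
    public static Map<String, String> parse(String body, String separator) {
        Map<String, String> headers = new HashMap<>();
        for (String token : body.split(separator)) {
            int eq = token.indexOf('=');
            if (eq > 0) {
                headers.put(token.substring(0, eq).trim(),
                            token.substring(eq + 1).trim());
            }
        }
        return headers;
    }

    public static void main(String[] args) {
        Map<String, String> h =
            parse("service=billing|level=ERROR|msg=connection timeout", "\\|");
        System.out.println(h.get("service") + " " + h.get("level")); // billing ERROR
    }
}
```

Since the keys aren't known ahead of time, the same parser would cover every service, and the resulting headers could then drive HDFS path escaping or land as fields in ElasticSearch.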

Matt Wise
Sr. Systems Architect