Flume >> mail # user >> ElasticSearch logstash serializer fields stored as strings

ElasticSearch logstash serializer fields stored as strings
Hello all,

I've been using the ElasticSearchLogStashEventSerializer with pretty great
success. It's great that it defaults to strings for all the headers, but in
some cases I would rather my data be stored as a float in ElasticSearch.
I've been digging through the code for the serializer, but I'm not so great
with Java. I noticed the following comment in the dynamic serializer:

 * A best effort will be used to determine the content-type, if it cannot be
 * determined fields will be indexed as Strings

So I looked at the code for appending headers and saw this snippet:

    Map<String, String> headers = event.getHeaders();
    for (String key : headers.keySet()) {
      ContentBuilderUtil.appendField(builder, key,
          headers.get(key).getBytes(charset));
    }

In the logstash serializer the fields portion gets set by:

    for (String key : headers.keySet()) {
      byte[] val = headers.get(key).getBytes(charset);
      ContentBuilderUtil.appendField(builder, key, val);
    }
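To make sure I understand what both snippets do with a float-valued header, here is a minimal standalone sketch (the header name `response_time` is just an example I made up): since Flume headers are `Map<String, String>`, a numeric value only ever reaches the serializer as the UTF-8 bytes of its String form.

```java
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

// Sketch: a float-valued Flume header is still a String, so the serializer
// receives the UTF-8 bytes of the text "12.5", not a binary float.
public class HeaderBytesDemo {
    public static void main(String[] args) {
        Map<String, String> headers = new HashMap<>();
        headers.put("response_time", "12.5"); // numeric in spirit, String in type

        byte[] val = headers.get("response_time").getBytes(StandardCharsets.UTF_8);
        System.out.println(val.length); // prints 4: the bytes '1', '2', '.', '5'
    }
}
```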

Will either of these snippets work when the value of a header is a float?
If so, I'd like to give it a try. The reason I'm doing all of this is to
take advantage of some Kibana features that require number-based fields
(like graphing response times over time).
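One idea I've been toying with, sketched below, is to try parsing each header value as a float before appending it, falling back to the String when parsing fails. The `coerceHeaderValue` helper is hypothetical, my own naming, not part of Flume or the serializer:

```java
// Hypothetical helper (not part of Flume's API): attempt to parse a header
// value as a Float so Elasticsearch could map the field numerically;
// keep it as a String when it isn't numeric.
public class HeaderTypeSketch {
    static Object coerceHeaderValue(String value) {
        try {
            return Float.valueOf(value);
        } catch (NumberFormatException e) {
            return value; // not numeric, leave as String
        }
    }

    public static void main(String[] args) {
        System.out.println(coerceHeaderValue("12.5").getClass().getSimpleName());  // Float
        System.out.println(coerceHeaderValue("web01").getClass().getSimpleName()); // String
    }
}
```

No idea yet whether `appendField` would accept the result as-is, but it would at least separate "this header is numeric" from "this header is text" before serialization.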