Flume, mail # user - Events being cut by flume


Re: Events being cut by flume
ZORAIDA HIDALGO SANCHEZ 2013-09-09, 12:08
One more question. From the Flume 1.4 documentation:
deserializer.maxLineLength (default: 2048): Maximum number of characters to include in a single event. If a line exceeds this length, it is truncated, and the remaining characters on the line will appear in a subsequent event.

So do I need to specify a value like 300, then? If I do not specify it, my events get truncated.
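For reference, the property is set on the spooling directory source in the agent configuration, along these lines (the agent/source names are the ones used elsewhere in this thread; the value is only illustrative and should simply be larger than the longest line you expect):

agent.sources.rpb.deserializer.maxLineLength = 4096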
Thanks,

Zoraida.-
From: zoraida <[EMAIL PROTECTED]>
Reply-To: Flume User List <[EMAIL PROTECTED]>
Date: Sunday, September 8, 2013 21:05
To: Flume User List <[EMAIL PROTECTED]>
Subject: Re: Events being cut by flume

Good news:
The 9 lines being cut were because of maxLineLength (when a line is truncated, the remainder is added below as a different event). Great, so I can definitely deal with those files in Flume, and the final configuration is:

  *   agent.sources.rpb.inputCharset = ISO-8859-1
  *   agent.sources.rpb.deserializer.maxLineLength = 300
  *   agent.sources.rpb.deserializer.outputCharset = UTF-8
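For context, a sketch of how these lines sit in a complete source definition (I am assuming the spooling directory source here; the channel name and spool path are placeholders):

agent.sources = rpb
agent.channels = ch1
agent.sources.rpb.type = spooldir
agent.sources.rpb.spoolDir = /path/to/spool
agent.sources.rpb.channels = ch1
agent.sources.rpb.inputCharset = ISO-8859-1
agent.sources.rpb.deserializer.maxLineLength = 300
agent.sources.rpb.deserializer.outputCharset = UTF-8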

By the way, it works without removing the BOM character or converting to ISO-8859-1. It was a matter of telling Flume the right encoding (and iconv was the tool I used to "discover" it).
Hope this helps someone.

Zoraida.-

From: zoraida <[EMAIL PROTECTED]>
Reply-To: Flume User List <[EMAIL PROTECTED]>
Date: Sunday, September 8, 2013 19:52
To: Flume User List <[EMAIL PROTECTED]>
Subject: Re: Events being cut by flume

Hi all,

I have made some progress:

  *   First, I removed the BOM character. I don't know why they were adding an F0FF BOM, which would mean UTF-16BE, whereas the command "file -b" says the file is UTF-8.
  *   I used iconv to convert the files to the encoding I suspected they are in: cat file.csv | iconv -c -f UTF-8 -t ISO-8859-1 >> file.csv.iso (a cleaner version of these checks is sketched right after this list)
  *   I ran Flume with this configuration:
     *   agent.sources.rpb.inputCharset = ISO-8859-1
     *   agent.sources.rpb.deserializer.maxLineLength = 300
     *   agent.sources.rpb.deserializer.outputCharset = UTF-8
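A quick way to do the checks mentioned in the list above (file names are placeholders; assumes xxd and iconv are available):

head -c 4 file.csv | xxd
# a UTF-8 BOM shows up as ef bb bf, a UTF-16BE BOM as fe ff
iconv -c -f UTF-8 -t ISO-8859-1 file.csv > file.csv.iso
# same conversion as above, writing to a fresh output file instead of appending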

The resulting file has all the events from the original file! However, some lines have been added. Using diff, I have seen that Flume is splitting some events into two different lines (only 9 out of 180,000, but still). The other thing I have observed is that the resulting file contains the ^M character (and no, the file obtained with iconv does not contain it).
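One quick way to see whether the ^M (carriage return) characters are already in the input or only in Flume's output, and to strip them if needed (assumes GNU coreutils; file names are placeholders):

cat -A output_file | head -n 5
# CRLF line endings show up as ^M$ at the end of each line
tr -d '\r' < output_file > output_file.unix
# removes the carriage returns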

Wow, big mess… any idea?
Thanks.

From: zoraida <[EMAIL PROTECTED]>
Reply-To: Flume User List <[EMAIL PROTECTED]>
Date: Thursday, September 5, 2013 17:57
To: Flume User List <[EMAIL PROTECTED]>
Subject: Re: Events being cut by flume

One more thing: I have read a thread (see the last answer), but I am not able to find .flumespool-main.meta in the .flumespool dir (which is created but remains empty all the time):

https://groups.google.com/a/cloudera.org/forum/#!topic/cdh-user/eIEOwfKyaX0

Any idea?

Thanks

From: zoraida <[EMAIL PROTECTED]>
Reply-To: Flume User List <[EMAIL PROTECTED]>
Date: Thursday, September 5, 2013 17:34
To: Flume User List <[EMAIL PROTECTED]>
Subject: Re: Events being cut by flume

Hi Israel,

sorry for the delay. I tried your suggestion but it still does not work. I have noticed that if I do not specify the input/output encoding, the error is the same (it always stops at the same event, cutting it at the same character, and stops processing the rest of the file). However, comparing the resulting file with the one we get when specifying the encoding, we have noticed some differences. Specifically, there are some events that are split into two events because a line break is introduced (this happens when specifying the encoding). It looks like our files are not UTF-8, but the OS recognizes them as UTF-8 (some of them have a BOM and others do not). However, Flume does not recognize them as UTF-8 because of some weird character.

Thanks for your help; any other suggestion will be much appreciated.

From: Israel Ekpo <[EMAIL PROTECTED]>
Reply-To: Flume User List <[EMAIL PROTECTED]>
Date: Tuesday, August 27, 2013 17:53
To: Flume User List <[EMAIL PROTECTED]>
Subject: Re: Events being cut by flume

The default value for the available memory specified in $FLUME_HOME/bin/flume-ng is very small (20MB)

So, in your $FLUME_HOME/conf/flume-env.sh file

Try increasing your Java memory to a higher number (at most 50% of the available RAM)
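# example sizing only; keep -Xmx at or below ~50% of the machine's RAM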
JAVA_OPTS="-Xms4096m -Xmx4096m -XX:MaxPermSize=4096m"

Then, in your agent configuration file:

Increase the maximum number of lines per event to a much higher number (like 5000).

Also change the output encoding to UTF-8

Let's make sure that the input encoding matches the encoding of the original event. This can cause problems if it is not the right one.
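Put together, something along these lines in the agent configuration (I am assuming the spooling directory source and the "rpb" source name used earlier in the thread; 5000 is the value suggested above and applies to deserializer.maxLineLength, i.e. characters per line):

agent.sources.rpb.deserializer.maxLineLength = 5000
agent.sources.rpb.deserializer.outputCharset = UTF-8
# inputCharset must match whatever encoding the files are actually written in
agent.sources.rpb.inputCharset = UTF-8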

Let's see if these changes make a difference.
Author and Instructor for the Upcoming Book and Lecture Series
Massive Log Data Aggregation, Processing, Searching and Visualization with Open Source Software
http://massivelogdata.com
On 27 August 2013 11:13, ZORAIDA HIDALGO SANCHEZ <[EMAIL PROTECTED]> wrote:
Hi Israel,

thanks for your response. We already checked this; doing :set list in the vi editor, our events look like this:

"line1field1";"line1field2";"line1fieldN"$
"lineNfield1";"lineNfield2";"lineNfieldN"$

There are no event delimiters ($) between the fields of an event.
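A quick sanity check on the raw files along the same lines (assuming semicolon-separated fields as shown above; the file name is a placeholder):

awk -F';' '{ print NF }' file.csv | sort | uniq -c
# every line should report the same field count if no event spans two lines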
I have tried forcing the encoding (because I believe these files, which are generated by our customer, are converted from ASCII to UTF-8 with a BOM, and they could contain characters with more bytes than expected):

agent.sources.rpb.in