Flume, mail # user - Exceptions after reloading configuration


Re: Exceptions after reloading configuration
Brock Noland 2013-01-17, 02:53
Agreed, in 1.2 re-configuration was certainly buggy! :) And RollingFileSink doubly so.

--
Brock Noland
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
On Wednesday, January 16, 2013 at 6:36 PM, Juhani Connolly wrote:

> I hadn't noticed that patch, looks useful! Was mainly referring to 1.2
> where hot reconfiguration certainly wasn't reliable.
>
> So I guess Sam is looking at either upgrading (recommended, since 1.3.1
> has lots of other goodness) and/or doing a full restart on reconfiguration.
>
> On 01/17/2013 11:18 AM, Brock Noland wrote:
> > FLUME-1630 (in flume 1.3.1) hopefully improved re-configuration, but
> > in general I agree, restart is better.
> >
> > On Wed, Jan 16, 2013 at 6:11 PM, Juhani Connolly
> > <[EMAIL PROTECTED]> wrote:
> > > Switching configuration on a running node is pretty buggy; I would recommend
> > > just restarting Flume. While it will sometimes work, there are issues like
> > > components not getting properly shut down even when they are removed from the config.
> > >
> > >
> > > On 01/17/2013 05:41 AM, Yatchmenoff, Sam wrote:
> > >
> > > I have Flume 1.2.0 running in a production system with 3 collectors fed by
> > > ~30 agents running on our application servers. If I make a change to the
> > > node configuration on the collectors, then when that configuration is reloaded
> > > automatically, the collectors will occasionally fail and repeatedly report
> > > the following exception:
> > >
> > > 2013-01-16 20:31:22,353 ERROR flume.SinkRunner: Unable to deliver event.
> > > Exception follows.
> > > org.apache.flume.EventDeliveryException: Failed to process transaction
> > > at org.apache.flume.sink.RollingFileSink.process(RollingFileSink.java:218)
> > > at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68)
> > > at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147)
> > > at java.lang.Thread.run(Thread.java:679)
> > > Caused by: java.io.IOException: Stream Closed
> > > at java.io.FileOutputStream.writeBytes(Native Method)
> > > at java.io.FileOutputStream.write(FileOutputStream.java:297)
> > > at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
> > > at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
> > > at java.io.FilterOutputStream.write(FilterOutputStream.java:97)
> > > at org.apache.flume.serialization.BodyTextEventSerializer.write(BodyTextEventSerializer.java:71)
> > > at org.apache.flume.sink.RollingFileSink.process(RollingFileSink.java:195)
> > > ... 3 more
> > >
> > > After about a dozen of those, I will start seeing this exception:
> > >
> > > 2013-01-16 20:32:27,374 ERROR flume.SinkRunner: Unable to deliver event.
> > > Exception follows.
> > > org.apache.flume.EventDeliveryException: Unable to rotate file /mnt/rawlog/1358365369665-49 while delivering event
> > > at org.apache.flume.sink.RollingFileSink.process(RollingFileSink.java:155)
> > > at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68)
> > > at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147)
> > > at java.lang.Thread.run(Thread.java:679)
> > > Caused by: java.io.IOException: Stream Closed
> > > at java.io.FileOutputStream.writeBytes(Native Method)
> > > at java.io.FileOutputStream.write(FileOutputStream.java:297)
> > > at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
> > > at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
> > > at org.apache.flume.sink.RollingFileSink.process(RollingFileSink.java:149)
> > > ... 3 more
> > >
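[Editor's note] The "Stream Closed" cause in both traces is ordinary java.io behaviour: once the FileOutputStream underneath the sink has been closed (here, presumably by the reconfiguration tearing down the old RollingFileSink, as discussed above), any later write or flush through the BufferedOutputStream fails. A minimal standalone sketch of that failure mode, using only plain java.io and not Flume code:

import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;

public class StreamClosedDemo {
    public static void main(String[] args) throws IOException {
        File f = File.createTempFile("rolling-sink-demo", ".log");
        OutputStream out = new BufferedOutputStream(new FileOutputStream(f));

        // Normal operation: write an event body and flush it to disk.
        out.write("event 1\n".getBytes("UTF-8"));
        out.flush();

        // Simulate a reconfiguration closing the stream out from under the sink.
        out.close();

        // Simulate the still-running sink trying to deliver the next event.
        try {
            out.write("event 2\n".getBytes("UTF-8")); // buffered, does not fail yet
            out.flush();                              // throws java.io.IOException: Stream Closed
        } catch (IOException e) {
            System.err.println("Delivery after close failed: " + e);
        }
    }
}

This is consistent with the advice above: a full restart replaces both the stream and the sink that holds it, rather than leaving a half-torn-down sink writing to a closed stream.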
> > >
> > > Here is the configuration for that agent:
> > >
> > > agent1.sources = source1 source2
> > > agent1.sinks = sink1
> > > agent1.channels = channel1
> > >
> > > # Describe/configure source1
> > > agent1.sources.source1.type = avro
> > > agent1.sources.source1.bind = 0.0.0.0
> > > agent1.sources.source1.port = 35853
> > >
> > > agent1.sources.source2.type = netcat
> > > agent1.sources.source2.bind = localhost