Flume >> mail # user >> Uncaught Exception When Using Spooling Directory Source


Re: Uncaught Exception When Using Spooling Directory Source
Thank you very much! I cleaned all the related directories and restarted. I kept the source spooling directory empty, then started Flume, and then put some files into the spooling directory. But this time a new error occurred:

13/01/18 13:44:24 INFO avro.SpoolingFileLineReader: Preparing to move file /disk2/mahy/FLUME_TEST/source/sspstat.log.20130118112700-20130118112800.hs016.ssp to /disk2/mahy/FLUME_TEST/source/sspstat.log.20130118112700-20130118112800.hs016.ssp.COMPLETED
13/01/18 13:44:24 ERROR source.SpoolDirectorySource: Uncaught exception in Runnable
java.lang.IllegalStateException: File has changed size since being read: /disk2/mahy/FLUME_TEST/source/sspstat.log.20130118112700-20130118112800.hs016.ssp
        at org.apache.flume.client.avro.SpoolingFileLineReader.retireCurrentFile(SpoolingFileLineReader.java:241)
        at org.apache.flume.client.avro.SpoolingFileLineReader.readLines(SpoolingFileLineReader.java:185)
        at org.apache.flume.source.SpoolDirectorySource$SpoolDirectoryRunnable.run(SpoolDirectorySource.java:135)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
        at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:317)
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:150)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:98)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.runPeriodic(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:204)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
13/01/18 13:44:24 ERROR source.SpoolDirectorySource: Uncaught exception in Runnable
java.io.IOException: Stream closed
        at java.io.BufferedReader.ensureOpen(BufferedReader.java:97)
        at java.io.BufferedReader.readLine(BufferedReader.java:292)
        at java.io.BufferedReader.readLine(BufferedReader.java:362)
        at org.apache.flume.client.avro.SpoolingFileLineReader.readLines(SpoolingFileLineReader.java:180)
        at org.apache.flume.source.SpoolDirectorySource$SpoolDirectoryRunnable.run(SpoolDirectorySource.java:135)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
        at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:317)
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:150)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:98)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.runPeriodic(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:204)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
13/01/18 13:44:25 ERROR source.SpoolDirectorySource: Uncaught exception in Runnable
java.io.IOException: Stream closed
        at java.io.BufferedReader.ensureOpen(BufferedReader.java:97)
I think this is a typical scenario: Flume watches some directories and collects newly arriving files. I don't understand why the exception "File has changed size since being read" was thrown, or how to avoid it. Can you give some advice and guidance? Thanks!
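(The "File has changed size since being read" check fires when a file in the spool directory is still being written to after Flume opens it. Not from this thread, but a common way to avoid it is to write each file completely in a staging directory on the same filesystem and then atomically rename it into the spool directory, so Flume only ever sees finished files. A minimal sketch; the function and directory names are illustrative, not Flume API:)

```python
import os


def deliver_to_spool(data: bytes, filename: str, staging_dir: str, spool_dir: str) -> str:
    """Write the file fully in a staging directory, then atomically
    rename it into the spooling directory, so the spooldir source
    never observes a partially written (still-growing) file."""
    staged = os.path.join(staging_dir, filename)
    with open(staged, "wb") as f:
        f.write(data)
        f.flush()
        os.fsync(f.fileno())  # make sure all bytes are on disk first
    final = os.path.join(spool_dir, filename)
    # os.rename is atomic on POSIX when both paths are on the same filesystem
    os.rename(staged, final)
    return final
```

(The rename is only atomic when `staging_dir` and `spool_dir` share a filesystem; a cross-filesystem move degrades to copy-then-delete, which reintroduces the partial-file window.)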
On Fri, Jan 18, 2013 at 1:48 PM, Patrick Wendell <[EMAIL PROTECTED]> wrote:

> Hey Henry,
>
> The Spooling source assumes that each file is uniquely named. If it
> sees that a new file has arrived whose name it has already processed (and
> has rolled over to a COMPLETED file), it throws an error and shuts down.
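(The file names in the log above already follow a timestamp-plus-hostname pattern; one way to guarantee the uniqueness Patrick describes is to add a per-process sequence number as well. A minimal sketch, not part of Flume; the prefix and name format are illustrative:)

```python
import itertools
import socket
import time

_seq = itertools.count()


def unique_spool_name(prefix: str = "sspstat.log") -> str:
    """Build a spool file name from a timestamp, the short hostname,
    and a per-process sequence number, so no two files delivered by
    this process can share a name."""
    ts = time.strftime("%Y%m%d%H%M%S")
    host = socket.gethostname().split(".")[0]
    return f"{prefix}.{ts}.{host}.{next(_seq):06d}"
```

(If several processes or hosts feed the same spool directory, the hostname plus a process id in the name keeps them from colliding with each other too.)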

Henry Ma