Flume Sink Cleanup
Hi

This is a newbie question.

I have configured my Flume sink as a Rolling File Sink.

I have the following in my flume-conf.properties file:

agent1.sinks.purepath.type = com.x.diagnostics.flume.RollingFileSink
# once an hour
agent1.sinks.purepath.sink.rollInterval = 3600
# Force cutoff at 100 MB
agent1.sinks.purepath.sink.rollSize = 100
agent1.sinks.purepath.sink.directory = /log/export
agent1.sinks.purepath.batchSize = 1000

The directory fills up over time. Is there a way to clean it up? Do I have to
write an external cron job to clean up the directory, or is there a way to
overwrite (recycle) the files after a specified period, say 6 months?
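
For clarity, this is roughly the kind of external cleanup job I mean (just a
sketch, not Flume-specific; it assumes the /log/export directory from the
config above and a roughly 6-month / 180-day retention, and the class name is
made up). It could be run daily from cron:

import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.attribute.FileTime;
import java.time.Instant;
import java.time.temporal.ChronoUnit;

/** Sketch: deletes rolled files in /log/export older than ~180 days. */
public class ExportDirCleanup {
    public static void main(String[] args) throws IOException {
        Path dir = Paths.get("/log/export");            // directory used by the sink above
        Instant cutoff = Instant.now().minus(180, ChronoUnit.DAYS);

        try (DirectoryStream<Path> files = Files.newDirectoryStream(dir)) {
            for (Path file : files) {
                if (!Files.isRegularFile(file)) {
                    continue;                           // skip subdirectories etc.
                }
                FileTime modified = Files.getLastModifiedTime(file);
                if (modified.toInstant().isBefore(cutoff)) {
                    Files.delete(file);                 // remove files past the retention window
                }
            }
        }
    }
}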

Thanks, Rajesh