Writing to HDFS from multiple HDFS agents (separate machines)
Hi guys,

I'm new to Flume (and to HDFS, for that matter), using the version packaged
with CDH4 (1.3.0), and was wondering how others keep the file names written
by each HDFS sink distinct when multiple agents on separate machines write
to the same HDFS cluster.

My initial thought is to create a separate sub-directory in HDFS for each
sink (something like the sketch below), though I suspect the better way is
to somehow prefix each file with a unique sink id.  Are there any patterns
that others are following for this?
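
For what it's worth, here is a rough, untested sketch of the sub-directory
idea, based on my reading of the Flume 1.3.0 user guide. The agent, source,
channel and sink names and the hdfs://namenode path are just placeholders.
The host interceptor adds a "host" header to each event, and the %{host}
escape sequence in hdfs.path then sends each machine's events to its own
directory, so file names can't collide across agents:

    # One agent per machine; "agent1", "src1", "ch1", "sink1" are placeholders.
    # Host interceptor stamps each event with a "host" header (hostname, not IP).
    agent1.sources.src1.interceptors = hostint
    agent1.sources.src1.interceptors.hostint.type = host
    agent1.sources.src1.interceptors.hostint.useIP = false

    # HDFS sink writes into a per-host sub-directory by referencing that
    # header in hdfs.path, so each agent gets its own directory.
    agent1.sinks.sink1.type = hdfs
    agent1.sinks.sink1.channel = ch1
    agent1.sinks.sink1.hdfs.path = hdfs://namenode/flume/events/%{host}
    agent1.sinks.sink1.hdfs.filePrefix = events

Putting the per-sink id into the file name itself would presumably mean
using an escape sequence in hdfs.filePrefix instead, but I'm not sure the
1.3.0 sink supports escapes there, which is why the path-based variant
seemed like the safer sketch.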

-Gary