Flume user mailing list: Writing to HDFS from multiple HDFS agents (separate machines)

Gary Malouf 2013-03-14, 21:54
Re: Writing to HDFS from multiple HDFS agents (separate machines)
Hello sir,

    One idea would be to create the subdirectories using the machines'
hostnames, since you are getting data from multiple sources. You can then
easily tell which data belongs to which machine.
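
(A minimal sketch of one way this is commonly wired up in Flume, not taken
from the thread: a host interceptor stamps each event with the originating
hostname, and the HDFS sink path uses that header to pick the subdirectory.
The agent, source, and sink names a1/r1/k1 and the path are illustrative.)

    # Illustrative names (a1/r1/k1) and path; adjust to your setup.
    # The host interceptor adds a "hostname" header to each event.
    a1.sources.r1.interceptors = i1
    a1.sources.r1.interceptors.i1.type = host
    a1.sources.r1.interceptors.i1.useIP = false
    a1.sources.r1.interceptors.i1.hostHeader = hostname

    # The %{hostname} escape in the sink path creates one subdirectory per machine.
    a1.sinks.k1.type = hdfs
    a1.sinks.k1.hdfs.path = hdfs://namenode/flume/events/%{hostname}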

Warm Regards,
Tariq
https://mtariq.jux.com/
cloudfront.blogspot.com
On Fri, Mar 15, 2013 at 3:24 AM, Gary Malouf <[EMAIL PROTECTED]> wrote:

> Hi guys,
>
> I'm new to Flume (and HDFS, for that matter), using the version packaged
> with CDH4 (1.3.0), and was wondering how others keep the file names written
> by each HDFS sink distinct.
>
> My initial thought is to create a separate sub-directory in HDFS for each
> sink, though I feel the better way is to somehow prefix each file with a
> unique sink id.  Are there any patterns that others are following for this?
>
> -Gary
>
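
(As for the per-sink file prefix idea in the quoted question, the HDFS sink's
hdfs.filePrefix property can carry a per-sink identifier, so files from
different sinks stay distinguishable even in a shared directory. A minimal
sketch with hypothetical agent and sink names, not taken from the thread:)

    # Hypothetical names and prefixes; each agent sets a distinct filePrefix
    # so its files are identifiable in the shared HDFS directory.
    agent1.sinks.k1.type = hdfs
    agent1.sinks.k1.hdfs.path = hdfs://namenode/flume/events
    agent1.sinks.k1.hdfs.filePrefix = agent1-k1

    agent2.sinks.k1.type = hdfs
    agent2.sinks.k1.hdfs.path = hdfs://namenode/flume/events
    agent2.sinks.k1.hdfs.filePrefix = agent2-k1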
Other replies in this thread:

Seshu V 2013-03-15, 21:20
Paul Chavez 2013-03-14, 22:31
Gary Malouf 2013-03-14, 22:34
Mike Percy 2013-03-15, 01:46
Gary Malouf 2013-03-15, 02:30
Gary Malouf 2013-03-15, 02:42
Mike Percy 2013-03-15, 20:43
Paul Chavez 2013-03-15, 03:30