Flume user mailing list: How to upload the SEQ data into hdfs


Thread:
mardan Khan    2012-07-24, 00:26
Brock Noland   2012-07-24, 00:33
mardan Khan    2012-07-24, 01:07
Will McQueen   2012-07-24, 02:46
mardan Khan    2012-07-24, 11:58
Brock Noland   2012-07-24, 12:23
mardan Khan    2012-07-24, 23:18
mardan Khan    2012-07-25, 03:45
Re: How to upload the SEQ data into hdfs
Hi Mardan,

You can try:

# Name the channel, source, and sink for agent2
agent2.channels = c1
agent2.sources = r1
agent2.sinks = k1

# Memory channel: buffers events in the agent's heap
agent2.channels.c1.type = MEMORY

# Sequence-generator source: emits an incrementing counter, handy for testing
agent2.sources.r1.channels = c1
agent2.sources.r1.type = SEQ

# HDFS sink: writes the events out to the given HDFS path
agent2.sinks.k1.channel = c1
agent2.sinks.k1.type = HDFS
agent2.sinks.k1.hdfs.path = hdfs://134.83.35.24/user/mardan/flume/
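
By default the HDFS sink writes SequenceFiles and rolls a new file every 30
seconds. If you want plain text output or different roll behaviour, the
standard hdfs.* sink properties can be added; the values below are only
illustrative:

# Write the raw event body instead of the default SequenceFile
agent2.sinks.k1.hdfs.fileType = DataStream
# Roll a new file every 60 s; disable size- and count-based rolling
agent2.sinks.k1.hdfs.rollInterval = 60
agent2.sinks.k1.hdfs.rollSize = 0
agent2.sinks.k1.hdfs.rollCount = 0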

Be sure that the Hadoop libraries are installed on the same box as the Flume agent, since the HDFS sink needs them on its classpath.
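
Assuming you save the configuration as agent2.conf (a file name picked just
for this example), the agent can be started with the stock flume-ng script
and the output checked with the Hadoop CLI:

# Start the agent named agent2 with the config file above
flume-ng agent --conf conf --conf-file agent2.conf --name agent2 -Dflume.root.logger=INFO,console

# Verify that event files are landing in HDFS
hadoop fs -ls /user/mardan/flume/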

Cheers,
Will

On Mon, Jul 23, 2012 at 5:26 PM, mardan Khan <[EMAIL PROTECTED]> wrote:

> Hi,
>
> I am just doing some testing: I am generating a sequence of events and want
> to upload them into HDFS. My configuration file is as follows:
>
> agent2.channels = c1
> agent2.sources = r1
> agent2.sinks = k1
>
> agent2.channels.c1.type = MEMORY
>
> agent2.sources.r1.channels = c1
> agent2.sources.r1.type = SEQ
>
> agent2.sinks.k1.channel = c1
> agent2.sinks.k1.type = LOGGER
>
>
> Is it possible to upload into HDFS, and if so, what changes do I need to
> make in the configuration file?
>
>
> Many thanks
>
>