Flume >> mail # user >> multiple agents


Ashutoshsharma 2012-11-08, 01:39
Alexander Lorenz 2012-11-08, 07:20
Ashutoshsharma 2012-11-08, 07:37
Juhani Connolly 2012-11-08, 08:07
Ashutoshsharma 2012-11-08, 08:18
Ashutoshsharma 2012-11-09, 08:43
Re: multiple agents
I can't see any obvious problem with your config.

When you start up, check your logs to see whether all the components were
configured and started correctly. You may need to adjust the log4j
configuration in your conf directory.
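
For instance, starting the agent with the log level raised to DEBUG makes
misconfigured components much easier to spot (a typical flume-ng
invocation, using the agent name "agent" from your config; adjust paths to
your installation):

    bin/flume-ng agent -n agent -c conf -f conf/flume.conf -Dflume.root.logger=DEBUG,console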

Are all your file channels configured to write to different directories?
If they are set up to use the same location, things aren't going to work
well.
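
For example, each flow's file channel should get its own checkpoint and
data directories, along the lines of the tx-file-channel paths in your
config (the dev and web paths below are assumptions by analogy):

    agent.channels.tx-file-channel.checkpointDir = /flume/agent/tx-file-channel/checkpoint
    agent.channels.tx-file-channel.dataDirs = /flume/agent/tx-file-channel/data
    agent.channels.dev-file-channel.checkpointDir = /flume/agent/dev-file-channel/checkpoint
    agent.channels.dev-file-channel.dataDirs = /flume/agent/dev-file-channel/data
    agent.channels.web-file-channel.checkpointDir = /flume/agent/web-file-channel/checkpoint
    agent.channels.web-file-channel.dataDirs = /flume/agent/web-file-channel/data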

On 11/09/2012 05:43 PM, Ashutoshsharma (Open Platform Development Team) wrote:
>
> Hi,
>
> Can I define multiple flows with different sources, sinks, and channels
> as below:
>
> agent.sources = tx-avro dev-avro web-avro
> agent.sinks = tx-es-sink tx-hdfs-sink dev-es-sink dev-hdfs-sink web-es-sink web-hdfs-sink
> agent.channels = tx-mem-channel tx-file-channel dev-mem-channel dev-file-channel web-mem-channel web-file-channel
>
> ##### Flow1 - Start #################################
>
> ## Define Avro source
> agent.sources.tx-avro.type = avro
> agent.sources.tx-avro.bind = 0.0.0.0
> agent.sources.tx-avro.port = 35853
> agent.sources.tx-avro.channels = tx-mem-channel tx-file-channel
> agent.sources.tx-avro.selector.type = replicating
>
> ## Define HDFS sink
> agent.sinks.tx-hdfs-sink.type = hdfs
> agent.sinks.tx-hdfs-sink.hdfs.path = hdfs://…/%{hostname}/%Y-%m-%d
> agent.sinks.tx-hdfs-sink.hdfs.fileType = DataStream
> agent.sinks.tx-hdfs-sink.hdfs.writeFormat = Text
> agent.sinks.tx-hdfs-sink.hdfs.filePrefix = transaction
> agent.sinks.tx-hdfs-sink.channel = tx-file-channel
> agent.sinks.tx-hdfs-sink.hdfs.rollCount = 0
> agent.sinks.tx-hdfs-sink.hdfs.rollSize = 0
> agent.sinks.tx-hdfs-sink.hdfs.rollInterval = 600
>
> ## Define ES sink
> agent.sinks.tx-es-sink.type = org.flume.sink.ESSink
> agent.sinks.tx-es-sink.indexName = txlog
> agent.sinks.tx-es-sink.typeName = tx
> agent.sinks.tx-es-sink.cluster = es-cluster
> agent.sinks.tx-es-sink.host = 9.127.216.198
> agent.sinks.tx-es-sink.channel = tx-mem-channel
>
> ## Define the memory channel
> agent.channels.tx-mem-channel.type = memory
> agent.channels.tx-mem-channel.capacity = 10000
> agent.channels.tx-mem-channel.transactionCapacity = 20
>
> ## Define the file channel
> agent.channels.tx-file-channel.type = FILE
> agent.channels.tx-file-channel.checkpointDir = /flume/agent/tx-file-channel/checkpoint
> agent.channels.tx-file-channel.dataDirs = /flume/agent/tx-file-channel/data
>
> Flow2 and Flow3 are defined the same way as Flow1, each with a different
> port for its Avro source. I am using flow1, flow2 and flow3 for three
> different types of logs, which are stored separately in different
> locations.
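>
> For example, flow2's Avro source differs from flow1's only in its port
> (a sketch; the port number 35854 here is illustrative):
>
> agent.sources.dev-avro.type = avro
> agent.sources.dev-avro.bind = 0.0.0.0
> agent.sources.dev-avro.port = 35854
> agent.sources.dev-avro.channels = dev-mem-channel dev-file-channel
> agent.sources.dev-avro.selector.type = replicating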
>
> When I define the collector's flume.conf as above, the agents fail to
> connect to the Avro sources and return an RPC connection error. However,
> I have checked that an agent is able to send events to the collector if
> I specify only one Avro source.
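>
> For reference, each upstream agent's Avro sink points at one of the
> collector's ports, along these lines (the agent name, channel name and
> host below are placeholders, not my actual config):
>
> agent1.sinks.avro-sink.type = avro
> agent1.sinks.avro-sink.hostname = <collector-host>
> agent1.sinks.avro-sink.port = 35853
> agent1.sinks.avro-sink.channel = agent1-channel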
>
> So the question is: can I define the configuration above to run multiple
> agents (flows) in this way?
>
> ----------------------------------------
> Thanks & Regards,
> Ashutosh Sharma
> ----------------------------------------
>
> *From:* Juhani Connolly [mailto:[EMAIL PROTECTED]]
> *Sent:* Thursday, November 08, 2012 5:07 PM
> *To:* [EMAIL PROTECTED]
> *Subject:* Re: multiple agents
>
> Hi Ashutosh,
>
> As was pointed out, a single configuration will work fine.
>
> There is nothing stopping you from running multiple agent processes, but
> that won't be possible with the service scripts that come with the Flume
> packaged in CDH; you'd have to write your own service scripts. But
> really, I can't think of a use case where you would want multiple
> processes.
>
> On 11/08/2012 10:39 AM, Ashutoshsharma (Open Platform Development Team) wrote:
>
    Hi,

    I have sources to collect multiple types of logs (mainly three
    types). Most of them generate at least two types of logs. That
    means a server generates two types of log. For my use case, I
    created two separate agents running on a server to collect the
Roshan Naik 2012-11-09, 11:46
Ashutoshsharma 2012-11-12, 02:05
Ashutoshsharma 2012-11-12, 02:07
Nitin Pawar 2012-11-09, 08:48