Re: Log4J Appender in Flume
Yes that's what I wanted to do, but that feedback loop is killer. I'll just
have to switch to something else.

Thanks

- Connor
On Sat, Jan 19, 2013 at 5:25 PM, Brock Noland <[EMAIL PROTECTED]> wrote:

> Hi,
>
> Do I understand this correctly: you are going to use the flume log4j
> appender to collect flume logs?
>
> If so, I don't see how you'd avoid the feedback loop.
>
> Brock
>
> On Fri, Jan 18, 2013 at 11:13 AM, Connor Woodson <[EMAIL PROTECTED]>
> wrote:
> > I just ran into an unfortunate configuration issue; I want to use flume's
> > log4j appender as part of the agent itself in order to send error logs to a
> > monitoring system. However, when that system is down (which it should be
> > able to be without causing issues), the RPCClient (even though it's in a
> > Failover Sink) throws an error/warning that it can't connect to the system,
> > and then that error gets routed recursively back into the appender and so
> > forth...
> >
> > Is there any way to make this system work (using the log4j appender on error
> > messages), or if not any recommendations on how to track my error logs?
> >
> > Thanks,
> >
> > - Connor
>
>
>
> --
> Apache MRUnit - Unit testing MapReduce -
> http://incubator.apache.org/mrunit/
>
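
For context, here is a minimal sketch of the kind of log4j.properties setup Connor describes above, assuming log4j 1.x and Flume's org.apache.flume.clients.log4jappender.Log4jAppender; the host name, port, and appender names are illustrative, not taken from the thread:

    # Send the agent's own logs through the Flume log4j appender.
    # This is the configuration that produces the feedback loop.
    log4j.rootLogger = INFO, flume

    log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
    # Hypothetical monitoring endpoint:
    log4j.appender.flume.Hostname = monitoring.example.com
    log4j.appender.flume.Port = 41414

    # When the monitoring system is down, the RPC client under
    # org.apache.flume logs a WARN/ERROR about the failed connection.
    # The root logger hands that event back to the "flume" appender,
    # which fails again, logs again, and so on; that is the recursion
    # the thread describes. Detaching Flume's own loggers, e.g.
    #
    #   log4j.logger.org.apache.flume = INFO, console
    #   log4j.additivity.org.apache.flume = false
    #
    # breaks the loop, but it also keeps the agent's own errors out of
    # the monitoring system, which is exactly what Connor wanted to ship.

As Brock notes, with the appender attached to the agent's own root logger there is no obvious way to keep the error path open to the monitoring system without also feeding the RPC client's connection errors back into it.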