Re: flume to collect log files from various sources
Sorry for the late reply.
http://archive.cloudera.com/cdh/3/flume/UserGuide/
Check this out; you will find all the details there about how to configure it. I have never
tried the log4j appender.
On Thu, Jan 24, 2013 at 10:18 PM, yogi nerella <[EMAIL PROTECTED]> wrote:

> Hello Ramya,
>
> Does a Flume agent have to be installed on every host?
>     I want to avoid this by integrating the log4j appender into my app, to
> remove one additional overhead for administrators.
>     Is this a problem?
>
>
> Collecting data using the "taildir" command:
>    1.   Suppose I have one Flume agent sending log files from one host to
> another agent that is listening.
>     2.   How do I configure the receiving agent so that it can separate
> the log files again and write them out as they were received? (See the sketch
> after these questions.)
>
>
> Are there any sample configuration files for either the 'taildir' or 'multitail'
> commands?
>
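One common way to handle question 2 (a rough sketch only, with all agent names, hosts, and paths invented for illustration) is to have each sending agent stamp its events with the local hostname via Flume NG's host interceptor, and to have the receiving agent bucket events by that header in its sink path, assuming an HDFS sink:

# Sending agent (one per source host): tail a log file and stamp events with this host's name.
sender.sources = tailSrc
sender.channels = ch1
sender.sinks = avroOut

sender.sources.tailSrc.type = exec
sender.sources.tailSrc.command = tail -F /var/log/myapp/app.log
sender.sources.tailSrc.channels = ch1
sender.sources.tailSrc.interceptors = i1
sender.sources.tailSrc.interceptors.i1.type = host
sender.sources.tailSrc.interceptors.i1.useIP = false

sender.channels.ch1.type = memory

sender.sinks.avroOut.type = avro
sender.sinks.avroOut.channel = ch1
sender.sinks.avroOut.hostname = collector-host
sender.sinks.avroOut.port = 44444

# Receiving agent: write each origin host's events into its own directory via the %{host} header.
collector.sources = avroIn
collector.channels = ch1
collector.sinks = hdfsOut

collector.sources.avroIn.type = avro
collector.sources.avroIn.bind = 0.0.0.0
collector.sources.avroIn.port = 44444
collector.sources.avroIn.channels = ch1

collector.channels.ch1.type = memory

collector.sinks.hdfsOut.type = hdfs
collector.sinks.hdfsOut.channel = ch1
collector.sinks.hdfsOut.hdfs.path = /flume/logs/%{host}
collector.sinks.hdfsOut.hdfs.fileType = DataStream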
> Issue with log4jappender:
>
> I am evaluating the log4j appender, but using it I am not able to send events
> to the flume agent.
>
> When I turn on debugging, the log messages I see are as follows.
>
> Classpath:
> C:\ServiceManager\workspace\mongodb\Mongodb\bin;C:\apache\apache-flume-1.3.1-bin\lib\avro-1.7.2.jar;C:\apache\apache-flume-1.3.1-bin\lib\avro-ipc-1.7.2.jar;C:\apache\apache-flume-1.3.1-bin\lib\flume-ng-core-1.3.1.jar;C:\apache\apache-flume-1.3.1-bin\lib\flume-ng-log4jappender-1.3.1.jar;C:\apache\apache-flume-1.3.1-bin\lib\flume-ng-sdk-1.3.1.jar;C:\apache\apache-flume-1.3.1-bin\lib\jackson-core-asl-1.9.3.jar;C:\apache\apache-flume-1.3.1-bin\lib\jackson-mapper-asl-1.9.3.jar;C:\apache\apache-flume-1.3.1-bin\lib\netty-3.4.0.Final.jar;C:\apache\apache-flume-1.3.1-bin\lib\slf4j-api-1.6.1.jar;C:\apache\apache-flume-1.3.1-bin\lib\slf4j-log4j12-1.6.1.jar;C:\apache\apache-flume-1.3.1-bin\lib\log4j-1.2.16.jar
> 14:27:29,600 DEBUG NettyAvroRpcClient:420 - Batch size string = 0
> 14:27:29,604  WARN NettyAvroRpcClient:426 - Invalid value for batchSize:
> 0; Using default value.
> 14:27:29,649 DEBUG NettyTransceiver:195 - Using Netty bootstrap options:
> {tcpNoDelay=true, connectTimeoutMillis=20000}
> 14:27:29,649 DEBUG NettyTransceiver:252 - Connecting to /
> 16.90.218.66:44444
> 14:27:29,675 DEBUG NettyTransceiver:491 - [id: 0x01480773] OPEN
> 14:27:29,699 DEBUG NettyTransceiver:491 - [id: 0x01480773, /
> 15.80.67.94:61452 => /16.90.218.66:44444] BOUND: /15.80.67.94:61452
> log4j:ERROR Flume append() failed.
> 14:27:29,940  INFO LogTest:13 - main started at Wed Jan 23 14:27:29 PST
> 2013
> 14:27:54,501 DEBUG NettyTransceiver:314 - Disconnecting from
> 16.90.218.66/16.90.218.66:44444
> 14:27:54,501 DEBUG NettyTransceiver:314 - Disconnecting from
> 16.90.218.66/16.90.218.66:44444
> 14:27:54,502 DEBUG NettyTransceiver:314 - Disconnecting from
> 16.90.218.66/16.90.218.66:44444
> 14:27:54,502 DEBUG NettyTransceiver:314 - Disconnecting from
> 16.90.218.66/16.90.218.66:44444
> 14:27:54,502 DEBUG NettyTransceiver:314 - Disconnecting from
> 16.90.218.66/16.90.218.66:44444
>
>
> Yogi
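For reference, a minimal log4j.properties sketch for the flume-ng log4j appender shipped in the lib directory listed above; the hostname and port are placeholders and must point at an agent that is actually running an Avro source on that port (the "log4j:ERROR Flume append() failed." line above makes that worth double-checking):

# Minimal sketch; hostname/port must match the agent's Avro source.
log4j.rootLogger=INFO, flume

log4j.appender.flume=org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname=16.90.218.66
log4j.appender.flume.Port=44444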
>
>
>
>
>
> On Wed, Jan 23, 2013 at 8:57 PM, Sri Ramya <[EMAIL PROTECTED]> wrote:
>
>> There is an error displayed on the console: "Flume append() failed." Try to
>> resolve that first, and explain what your exact problem is.
>>
>> Note:
>> 1. A Flume agent has to be installed on every host from which you want to
>> collect data.
>> 2. If you want to collect data from a directory, then you have to use the
>> 'taildir' command.
>> 3. If you want to collect data from different files, you have to use the
>> 'multitail' command (see the sketch after this note).
>>
>> I think this info will be useful for you.
>> Thank you
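For what it is worth, in the Flume 0.9.x (CDH3) dataflow syntax described in the user guide linked earlier, those commands are sources that get wired to a sink per node. A rough sketch, with node names, paths, and ports invented for illustration: the first line tails every file in a directory on one host, the second tails a fixed set of files on another, and the third has a collector write everything it receives to HDFS.

web01 : tailDir("/var/log/myapp/") | agentBESink("collector01", 35853) ;
web02 : multitail("/var/log/myapp/a.log", "/var/log/myapp/b.log") | agentBESink("collector01", 35853) ;
collector01 : collectorSource(35853) | collectorSink("hdfs://namenode/flume/%Y-%m-%d/", "applog-") ;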
>>
>>
>>
>> On Thu, Jan 24, 2013 at 4:06 AM, yogi nerella <[EMAIL PROTECTED]> wrote:
>>
>>> Hi,
>>>
>>> I am a newbie to Flume, and would like to get some feedback on whether what
>>> I am doing works correctly.
>>>
>>>
>>> My application runs on multiple hosts, and I want to collect all of its log
>>> files in a central location.
>>>
>>> 1.  With my application I will ship all the relevant log4j appender jar files,
>>> and configure the agent's host and port information.
>>>
>>> 2. I will run a simple agent, and configure a source (avro), channel
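A minimal flume.conf sketch for such an agent, assuming a memory channel and a file_roll sink that writes the received events to a local directory; the agent name, port, and directory are placeholders, and the port must match what the log4j appender is configured with:

# Avro source -> memory channel -> file_roll sink; all names and paths are illustrative.
agent1.sources = avroSrc
agent1.channels = memCh
agent1.sinks = fileOut

agent1.sources.avroSrc.type = avro
agent1.sources.avroSrc.bind = 0.0.0.0
agent1.sources.avroSrc.port = 44444
agent1.sources.avroSrc.channels = memCh

agent1.channels.memCh.type = memory
agent1.channels.memCh.capacity = 10000

agent1.sinks.fileOut.type = file_roll
agent1.sinks.fileOut.channel = memCh
agent1.sinks.fileOut.sink.directory = /var/log/flume-collected

Such an agent would typically be started with something like: flume-ng agent --conf conf --conf-file flume.conf --name agent1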