
Flume, mail # user - flume to collect log files from various sources


Re: flume to collect log files from various sources
Sri Ramya 2013-01-24, 04:57
There is an error displayed on the console: "Flume append() failed." Try to
resolve it, and please explain what your exact problem is.

Note:
1. A Flume agent has to be installed on every host from which you want to
collect data.
2. If you want to collect data from a directory, you have to use the
'taildir' command.
3. If you want to collect data from different files, you have to use the
'multitail' command.
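
The Flume NG pipeline described in the quoted mail below (Avro source, memory channel, file_roll sink) could be sketched in an agent config roughly like this; the agent name "a1" and the output directory are placeholders I chose for illustration, not values from the original mail:

```properties
# Hypothetical agent "a1": Avro source -> memory channel -> file_roll sink
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# Avro source listening for events sent by the Log4jAppender
a1.sources.r1.type = avro
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 44444
a1.sources.r1.channels = c1

# In-memory channel (events are lost if the agent process dies)
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000

# file_roll sink writing rolled files into a local directory
a1.sinks.k1.type = file_roll
a1.sinks.k1.sink.directory = /var/log/flume
a1.sinks.k1.channel = c1
```

An agent with this config would be started with something like `flume-ng agent -n a1 -f agent.conf`.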

I think this info will be useful for you.
Thank you
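
On the application side, the Log4jAppender that the quoted mail ships with the application could be wired up in log4j.properties roughly as follows; the host and port are copied from the log output below, and the rest is a sketch rather than the poster's actual configuration:

```properties
# Route log4j events to the Flume Avro source (sketch)
log4j.rootLogger=INFO, flume
log4j.appender.flume=org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname=16.90.218.66
log4j.appender.flume.Port=44444
```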

On Thu, Jan 24, 2013 at 4:06 AM, yogi nerella <[EMAIL PROTECTED]> wrote:

> Hi,
>
> I am a newbie to Flume and would like some feedback on whether what I am
> doing is correct.
>
>
> My application runs on multiple hosts, I want to collect all log files to
> a central location.
>
> 1.  In My application I will ship all relevant log4jappender jar files,
> and configure the agent's host, port information.
>
> 2. I will run a simple agent, and configure a source (avro), channel
> (memory), and sink (file_roll).
>
> The file_roll sink will write the events received via the Avro source to
> the corresponding output files.
>
> 3. Do I need one agent for each host, or is there a way to configure a
> single agent to write to multiple files dynamically, based on the real
> source the message is received from?
>
> 4. If I modify the configuration file, will the agent re-read it?  (Ex: I
> want to add a new host and collect its log files into a new directory.)
>
> 5.  In my simple test, messages sent from my app are getting lost, and I
> see the following information in the console.
>
>
> Classpath:
> C:\ServiceManager\workspace\mongodb\Mongodb\bin;C:\apache\apache-flume-1.3.1-bin\lib\avro-1.7.2.jar;C:\apache\apache-flume-1.3.1-bin\lib\avro-ipc-1.7.2.jar;C:\apache\apache-flume-1.3.1-bin\lib\flume-ng-core-1.3.1.jar;C:\apache\apache-flume-1.3.1-bin\lib\flume-ng-log4jappender-1.3.1.jar;C:\apache\apache-flume-1.3.1-bin\lib\flume-ng-sdk-1.3.1.jar;C:\apache\apache-flume-1.3.1-bin\lib\jackson-core-asl-1.9.3.jar;C:\apache\apache-flume-1.3.1-bin\lib\jackson-mapper-asl-1.9.3.jar;C:\apache\apache-flume-1.3.1-bin\lib\netty-3.4.0.Final.jar;C:\apache\apache-flume-1.3.1-bin\lib\slf4j-api-1.6.1.jar;C:\apache\apache-flume-1.3.1-bin\lib\slf4j-log4j12-1.6.1.jar;C:\apache\apache-flume-1.3.1-bin\lib\log4j-1.2.16.jar
> 14:27:29,600 DEBUG NettyAvroRpcClient:420 - Batch size string = 0
> 14:27:29,604  WARN NettyAvroRpcClient:426 - Invalid value for batchSize:
> 0; Using default value.
> 14:27:29,649 DEBUG NettyTransceiver:195 - Using Netty bootstrap options:
> {tcpNoDelay=true, connectTimeoutMillis=20000}
> 14:27:29,649 DEBUG NettyTransceiver:252 - Connecting to /
> 16.90.218.66:44444
> 14:27:29,675 DEBUG NettyTransceiver:491 - [id: 0x01480773] OPEN
> 14:27:29,699 DEBUG NettyTransceiver:491 - [id: 0x01480773, /
> 15.80.67.94:61452 => /16.90.218.66:44444] BOUND: /15.80.67.94:61452
> log4j:ERROR Flume append() failed.
> 14:27:29,940  INFO LogTest:13 - main started at Wed Jan 23 14:27:29 PST
> 2013
> 14:27:54,501 DEBUG NettyTransceiver:314 - Disconnecting from
> 16.90.218.66/16.90.218.66:44444
> 14:27:54,501 DEBUG NettyTransceiver:314 - Disconnecting from
> 16.90.218.66/16.90.218.66:44444
> 14:27:54,502 DEBUG NettyTransceiver:314 - Disconnecting from
> 16.90.218.66/16.90.218.66:44444
> 14:27:54,502 DEBUG NettyTransceiver:314 - Disconnecting from
> 16.90.218.66/16.90.218.66:44444
> 14:27:54,502 DEBUG NettyTransceiver:314 - Disconnecting from
> 16.90.218.66/16.90.218.66:44444
>
>
> Appreciate your help
> Yogi
>