Flume, mail # user - flume to collect log files from various sources


Earlier messages in this thread:
- yogi nerella 2013-01-23, 22:36
- Sri Ramya 2013-01-24, 04:57
- yogi nerella 2013-01-24, 16:48
- Sri Ramya 2013-01-28, 06:35
Re: flume to collect log files from various sources
yogi nerella 2013-01-28, 16:29
Hello Ramya,

The above link shows example commands to configure a master, nodes, etc.,
but I do not see any "flume" binary, a "config" command, or a flume_site.xml
file in the out-of-the-box flume-1.3.1-bin or flume-1.4.0-bin distributions.
Am I missing something?
How do I set up my agents as nodes and my collectors as masters?
Thanks,
Yogi
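
[For context: the flume-1.3.1/1.4.0 tarballs mentioned above are Flume NG (1.x), which dropped the master/node model described in the linked 0.9.x guide. An NG agent is instead driven by a single properties file passed to `bin/flume-ng agent`. A minimal sketch, with agent name, port, and component names chosen here as placeholders:

```properties
# Hypothetical agent "a1": one netcat source, a memory channel, a logger sink.
a1.sources  = r1
a1.channels = c1
a1.sinks    = k1

# Source: accept plain-text events on a TCP port (placeholder port)
a1.sources.r1.type = netcat
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 44444
a1.sources.r1.channels = c1

# Channel: in-memory buffer between source and sink
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000

# Sink: log received events (handy for testing)
a1.sinks.k1.type = logger
a1.sinks.k1.channel = c1
```

Such a file would be started with something like `bin/flume-ng agent -n a1 -f conf/a1.properties -c conf`; there is no flume_site.xml or master/node setup in NG.]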
On Sun, Jan 27, 2013 at 10:35 PM, Sri Ramya <[EMAIL PROTECTED]> wrote:

> Sorry for the late reply.
> http://archive.cloudera.com/cdh/3/flume/UserGuide/
> Check out this guide; you will get all the details about how to configure it. And I never
> tried the log4jappender.
>
>
> On Thu, Jan 24, 2013 at 10:18 PM, yogi nerella <[EMAIL PROTECTED]> wrote:
>
>> Hello Ramya,
>>
>> Does a Flume agent have to be installed on every host?
>>     I want to avoid this by integrating the log4jappender into my app,
>> removing that additional overhead for administrators.
>>     Is this a problem?
>>
>>
>> Collecting data using the "taildir" command:
>>     1.  Suppose I have one Flume agent sending log files from one host to
>> another agent that is listening.
>>     2.  How do I configure the receiving agent so that it can separate
>> the log files again and write them out as they were received?
>>
>>
>> Are there any sample configuration files for either the 'taildir' or
>> 'multitail' commands?
>>
>> Issue with the log4jappender:
>>
>> I am evaluating the log4jappender, and with it I am not able to send
>> events to the Flume agent.
>>
>> When I turn on debugging, all the log messages I see are as follows.
>>
>> Classpath:
>> C:\ServiceManager\workspace\mongodb\Mongodb\bin;C:\apache\apache-flume-1.3.1-bin\lib\avro-1.7.2.jar;C:\apache\apache-flume-1.3.1-bin\lib\avro-ipc-1.7.2.jar;C:\apache\apache-flume-1.3.1-bin\lib\flume-ng-core-1.3.1.jar;C:\apache\apache-flume-1.3.1-bin\lib\flume-ng-log4jappender-1.3.1.jar;C:\apache\apache-flume-1.3.1-bin\lib\flume-ng-sdk-1.3.1.jar;C:\apache\apache-flume-1.3.1-bin\lib\jackson-core-asl-1.9.3.jar;C:\apache\apache-flume-1.3.1-bin\lib\jackson-mapper-asl-1.9.3.jar;C:\apache\apache-flume-1.3.1-bin\lib\netty-3.4.0.Final.jar;C:\apache\apache-flume-1.3.1-bin\lib\slf4j-api-1.6.1.jar;C:\apache\apache-flume-1.3.1-bin\lib\slf4j-log4j12-1.6.1.jar;C:\apache\apache-flume-1.3.1-bin\lib\log4j-1.2.16.jar
>> 14:27:29,600 DEBUG NettyAvroRpcClient:420 - Batch size string = 0
>> 14:27:29,604  WARN NettyAvroRpcClient:426 - Invalid value for batchSize:
>> 0; Using default value.
>> 14:27:29,649 DEBUG NettyTransceiver:195 - Using Netty bootstrap options:
>> {tcpNoDelay=true, connectTimeoutMillis=20000}
>> 14:27:29,649 DEBUG NettyTransceiver:252 - Connecting to /
>> 16.90.218.66:44444
>> 14:27:29,675 DEBUG NettyTransceiver:491 - [id: 0x01480773] OPEN
>> 14:27:29,699 DEBUG NettyTransceiver:491 - [id: 0x01480773, /
>> 15.80.67.94:61452 => /16.90.218.66:44444] BOUND: /15.80.67.94:61452
>> log4j:ERROR Flume append() failed.
>> 14:27:29,940  INFO LogTest:13 - main started at Wed Jan 23 14:27:29 PST
>> 2013
>> 14:27:54,501 DEBUG NettyTransceiver:314 - Disconnecting from
>> 16.90.218.66/16.90.218.66:44444
>> 14:27:54,501 DEBUG NettyTransceiver:314 - Disconnecting from
>> 16.90.218.66/16.90.218.66:44444
>> 14:27:54,502 DEBUG NettyTransceiver:314 - Disconnecting from
>> 16.90.218.66/16.90.218.66:44444
>> 14:27:54,502 DEBUG NettyTransceiver:314 - Disconnecting from
>> 16.90.218.66/16.90.218.66:44444
>> 14:27:54,502 DEBUG NettyTransceiver:314 - Disconnecting from
>> 16.90.218.66/16.90.218.66:44444
>>
>>
>> Yogi
>>
>>
>>
>>
>>
>> On Wed, Jan 23, 2013 at 8:57 PM, Sri Ramya <[EMAIL PROTECTED]> wrote:
>>
>>> There is an error displayed on the console: "Flume append() failed." Try to
>>> resolve it.
>>> Explain to me what your exact problem is.
>>>
>>> Note:
>>> 1. A Flume agent has to be installed on every host from which you want to
>>> collect data.
>>> 2. If you want to collect data from a directory, then you have to use the
>>> 'taildir' command.
>>> 3. If you want to collect data from different files, you have to use the
>>> 'multitail' command.
>>>
>>> I think this info will be useful for you.
>>> Thank you.
>>>
>>>
>>>
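
[The "Flume append() failed" error in the log above is raised by Flume's Log4jAppender when it cannot deliver an event to the agent it points at; the appender speaks Avro RPC, so the agent's source must be of type `avro` on the same host and port (a netcat or other source type produces exactly this failure). A minimal client-side sketch, reusing the 16.90.218.66:44444 endpoint from the log purely as an illustration:

```properties
# log4j.properties (client side) - route the root logger to Flume's Log4jAppender
log4j.rootLogger = INFO, flume
log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = 16.90.218.66
log4j.appender.flume.Port = 44444
```

The matching agent-side source would then be declared with `type = avro` and bound to the same port. This is a sketch under those assumptions, not a diagnosis of the failure in the log above.]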
Later replies:
- Sri Ramya 2013-01-29, 03:40
- Alexander Alten-Lorenz 2013-01-29, 07:37