Abhijeet Shipure 2013-08-27, 09:59
Roshan Naik 2013-08-27, 18:36
Re: Fail over for flume source
Abhijeet Shipure 2013-08-29, 15:28
Thanks for the reply. However, I think the Failover Sink Processor does not
solve the issue. My problem is how to manage failover for the source:
in this case, if the server hosting the Flume source crashes, the
router does not know where to send syslogs.
I was thinking something like Red Hat Clustering could be used? What do
you think?
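For reference, one common way to fail over a single listener IP is a floating
virtual IP managed by VRRP, e.g. with keepalived, rather than a full cluster
suite. A minimal sketch only; the interface name, virtual_router_id, and the
VIP 10.0.0.100 below are hypothetical placeholders:

```
# /etc/keepalived/keepalived.conf on the primary Flume agent host.
# The standby host runs the same stanza with state BACKUP and a
# lower priority; keepalived moves the VIP to it on failure.
vrrp_instance flume_vip {
    state MASTER
    interface eth0
    virtual_router_id 51
    priority 100
    advert_int 1
    virtual_ipaddress {
        10.0.0.100      # the single IP the router sends syslogs to
    }
}
```

Both hosts would run identical Flume agents bound to the VIP (or 0.0.0.0), so
whichever host holds the VIP receives the router's syslog stream.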
On Wed, Aug 28, 2013 at 12:06 AM, Roshan Naik <[EMAIL PROTECTED]>wrote:
> Take a look at the 'Flume Sink Processor' section in the User Guide. The
> subsection 'Failover Sink Processor' would interest you.
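From the User Guide, a failover sink group sends events to the highest-priority
live sink and fails over to the next one on error. A minimal sketch, assuming an
agent named a1 with two sinks k1 and k2 already defined elsewhere in the config:

```properties
# Group the two sinks and select the failover processor
a1.sinkgroups = g1
a1.sinkgroups.g1.sinks = k1 k2
a1.sinkgroups.g1.processor.type = failover
# The higher-priority sink is used while it is healthy
a1.sinkgroups.g1.processor.priority.k1 = 10
a1.sinkgroups.g1.processor.priority.k2 = 5
# Maximum backoff (ms) for a failed sink before it is retried
a1.sinkgroups.g1.processor.maxpenalty = 10000
```

Note this covers failover on the sink side only, not the source.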
> On Tue, Aug 27, 2013 at 2:59 AM, Abhijeet Shipure <[EMAIL PROTECTED]>wrote:
>> I am working on a Hadoop-based log archival solution where logs are
>> generated by telecom routers.
>> I have very limited knowledge of the routers, but I know that they will
>> send syslogs to a given IP address and port.
>> If we want to use Flume to collect these syslogs, we will run a Flume
>> agent with a Syslog UDP source on the same server the router sends logs
>> to.
>> Assuming the router can send log messages to only a single IP, how can
>> we achieve failover for the Flume agent in such cases?
>> This is something urgent and any help would be greatly appreciated.
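The setup described above can be sketched as a Flume agent with a syslogudp
source feeding a channel; the agent and component names and the port here are
hypothetical, and the sink is assumed to be defined elsewhere:

```properties
# Agent a1: Syslog UDP source -> memory channel -> (sink defined elsewhere)
a1.sources = r1
a1.channels = c1
a1.sources.r1.type = syslogudp
a1.sources.r1.host = 0.0.0.0    # listen on all interfaces
a1.sources.r1.port = 5140       # port the router is configured to target
a1.sources.r1.channels = c1
a1.channels.c1.type = memory
```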
Roshan Naik 2013-10-15, 18:21