Flume, mail # user - how to print the channel capacity


Re: how to print the channel capacity
Nitin Pawar 2013-05-15, 11:48
instead of a memory channel .. can you try a file channel?

i think when you say the exact point that can balance input and output .. you
want to figure out how many events the memory channel can buffer before you
start losing events .. is that correct?
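For reference, the file-channel suggestion above would look something like this in the agent configuration (a minimal sketch; the names a1/c1 and the directories are placeholders, only the property keys come from the Flume user guide):

```properties
# Hypothetical agent/channel names (a1, c1) and paths;
# a file channel persists events to disk instead of RAM
a1.channels = c1
a1.channels.c1.type = file
a1.channels.c1.checkpointDir = /var/flume/checkpoint
a1.channels.c1.dataDirs = /var/flume/data
# a file channel's capacity can be set far larger than a memory channel's
a1.channels.c1.capacity = 1000000
```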

from http://flume.apache.org/FlumeUserGuide.html#memory-channel :

capacity             100   The max number of events stored in the channel
transactionCapacity  100   The max number of events stored in the channel per transaction
keep-alive           3     Timeout in seconds for adding or removing an event
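Those three parameters map onto the agent configuration like this (a minimal sketch; a1/c1 are placeholder names, not from the thread):

```properties
# Hypothetical agent/channel names; property keys as in the Flume user guide
a1.channels = c1
a1.channels.c1.type = memory
# maximum number of events buffered in the channel
a1.channels.c1.capacity = 100
# maximum events per put/take transaction (must not exceed capacity)
a1.channels.c1.transactionCapacity = 100
# seconds to wait when adding to a full channel before the put fails
a1.channels.c1.keep-alive = 3
```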
On Wed, May 15, 2013 at 5:09 PM, liuyongbo <[EMAIL PROTECTED]> wrote:

>  Thanks for your answer.
>
> Additionally, I'm using a memory channel and writing the log to mongodb.
> When the input log is faster than the consumer (writing into mongo), the
> queue grows; when it reaches the max, the new input log events are lost.
>
> So, what I want to know is the exact point that can balance the input and
> output.
>
> *From:* Nitin Pawar [mailto:[EMAIL PROTECTED]]
> *Sent:* 2013-05-15 16:49
> *To:* [EMAIL PROTECTED]
> *Subject:* Re: how to print the channel capacity
>
>
> here is one example for the capacity defining flow:
>
> https://cwiki.apache.org/FLUME/flume-ng-performance-measurements.html
>
>
> On Wed, May 15, 2013 at 2:16 PM, Nitin Pawar <[EMAIL PROTECTED]>
> wrote:
>
> sorry, pressed enter too soon
>
> as for your question: how many events can a flume agent hold?
>
> sorry, but I don't think there is any direct answer to that... I may very
> well be wrong there, as I am myself pretty new to flume
>
> there was a JIRA for the capacity of file channels: FLUME-1571
>
> On Wed, May 15, 2013 at 1:50 PM, Nitin Pawar <[EMAIL PROTECTED]>
> wrote:
>
> for maximum performance on your data flow, the two things which will matter
> most are: the channel and the transaction batch size.
>
> when you say losing data, are you using a memory channel or a file channel?
>
> Flume can batch events. The batch size is the maximum number of events
> that a sink or client will attempt to take from a channel in a single
> transaction.
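In the configuration, that batch size is a sink property; a sketch using the HDFS sink as an example (the names a1/k1/c1 and the path are illustrative, not from the thread):

```properties
# Hypothetical names; hdfs.batchSize is the per-transaction take size
# described above
a1.sinks = k1
a1.sinks.k1.type = hdfs
a1.sinks.k1.channel = c1
a1.sinks.k1.hdfs.path = /flume/events
# number of events the sink takes from the channel in one transaction
a1.sinks.k1.hdfs.batchSize = 100
```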
>
>
> What is the channel type?
>
> do you have a slow sink, so the # of events written out is less than the #
> of events incoming to the channel, and over time it piles up?
>
> others may point out more things.
>
> Also, sharing your flume conf, and whether you are seeing any errors on
> flume, will help people find the problem.
>
> On Wed, May 15, 2013 at 11:07 AM, liuyongbo <[EMAIL PROTECTED]> wrote:
> **
>
> Hi:
>
> I'm using flume to pass log data to mongodb, but I find that some data is
> lost when the pressure is high, so I want to know the max load that flume
> can hold, and I need to print the channel capacity. But I cannot find a
> proper way to do this other than changing the source code. Any ideas?
>
> thanks
>
>
> --
> Nitin Pawar
>

--
Nitin Pawar