Re: One 0.72 ConsumerConnector, multiple threads, 1 blocks. What happens?
Jun - thanks again. This is very helpful.

Philip

On Jun 12, 2013, at 9:50 PM, Jun Rao <[EMAIL PROTECTED]> wrote:

> Actually, you are right. This can happen on a single topic too, if you have
> more than one consumer thread. Each consumer thread pulls data from a
> blocking queue, and one or more fetchers put data into those queues. Say
> you have two consumer threads and two partitions from the same broker.
> There is a single fetcher that fetches both partitions, and it puts each
> partition's data into a separate queue. So, if one thread stops consuming
> data, its queue will fill up at some point. This will block the fetcher
> from putting data into the other queue.
>
> Thanks,
>
> Jun
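
[To make the failure mode concrete, here is a minimal, self-contained Java sketch of the layout Jun describes, using only plain java.util.concurrent (no Kafka classes): a single fetcher thread feeding one bounded queue per consumer thread. The class name, queue capacity and "chunk" strings are illustrative only.]

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Plain-Java illustration of the behaviour described above: one fetcher
// feeds a bounded queue per stream. If one consumer thread stops draining
// its queue, the fetcher eventually blocks in put() and the other stream
// stops receiving data as well.
public class FetcherBlockingDemo {

    public static void main(String[] args) {
        final int capacity = 5;
        final BlockingQueue<String> queueA = new ArrayBlockingQueue<>(capacity);
        final BlockingQueue<String> queueB = new ArrayBlockingQueue<>(capacity);

        // Single fetcher: round-robins chunks from two partitions into two queues.
        Thread fetcher = new Thread(() -> {
            try {
                for (int i = 0; ; i++) {
                    queueA.put("partition-0 chunk " + i); // blocks once queueA is full
                    queueB.put("partition-1 chunk " + i);
                    System.out.println("fetcher enqueued chunk " + i);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        // Consumer A takes one chunk, then "blocks for minutes" in downstream work.
        Thread consumerA = new Thread(() -> {
            try {
                System.out.println("consumer A got: " + queueA.take());
                Thread.sleep(Long.MAX_VALUE); // simulates the stuck operation
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        // Consumer B keeps draining its queue, but after roughly `capacity` more
        // chunks it stops receiving anything: the fetcher is stuck in queueA.put()
        // and never reaches queueB.put().
        Thread consumerB = new Thread(() -> {
            try {
                while (true) {
                    System.out.println("consumer B got: " + queueB.take());
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        fetcher.start();
        consumerA.start();
        consumerB.start();
    }
}

[Running this, consumer B prints a handful of chunks and then goes silent even though its own thread never blocks, which is the behaviour the original question describes.]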
>
>
> On Wed, Jun 12, 2013 at 9:10 PM, Philip O'Toole <[EMAIL PROTECTED]> wrote:
>
>> Jun -- thanks.
>>
>> But if the topic is the same, doesn't each thread get a partition?
>> Isn't that how it works?
>>
>> Philip
>>
>> On Wed, Jun 12, 2013 at 9:08 PM, Jun Rao <[EMAIL PROTECTED]> wrote:
>>> Yes, when the consumer is consuming multiple topics, if one thread stops
>>> consuming topic 1, it can prevent new data from getting into the consumer
>>> for topic 2.
>>>
>>> Thanks,
>>>
>>> Jun
>>>
>>>
>>> On Wed, Jun 12, 2013 at 7:43 PM, Philip O'Toole <[EMAIL PROTECTED]>
>> wrote:
>>>
>>>> Hello -- we're using 0.72. We're looking at the source, but want to be
>>>> sure. :-)
>>>>
>>>> We create a single ConsumerConnector, call createMessageStreams, and
>>>> hand the streams off to individual threads. If one of those threads
>>>> calls next() on a stream, gets some messages, and then *blocks* in
>>>> some subsequent operation (and blocks for minutes), can it potentially
>>>> cause all other threads (calling next() on other streams) to block
>>>> too? Does something inside the ConsumerConnector block all other
>>>> stream processing? This would explain some behaviour we're seeing.
>>>>
>>>> Thanks,
>>>>
>>>> Philip
>>
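
[For later readers, the setup in the original question looks roughly like the sketch below: one ConsumerConnector, createMessageStreams, and a thread per stream calling next(). Those are the calls named in the thread; the package paths, stream/iterator class names and config property names are recalled from the 0.7-era Java API and should be treated as assumptions to check against the 0.7.2 code.]

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Assumed 0.7-era imports -- verify against the actual 0.7.2 jar.
import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;
import kafka.message.Message;

public class ThreadPerStreamConsumer {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("zk.connect", "localhost:2181"); // 0.7-style property names (assumption)
        props.put("groupid", "example-group");

        // One ConsumerConnector shared by all threads, as in the question.
        ConsumerConnector connector =
                Consumer.createJavaConsumerConnector(new ConsumerConfig(props));

        // Ask for N streams of one topic; each stream is backed by its own blocking queue.
        int numStreams = 2;
        Map<String, Integer> topicCount = new HashMap<>();
        topicCount.put("my-topic", numStreams);
        Map<String, List<KafkaStream<Message>>> streams =
                connector.createMessageStreams(topicCount);

        // Hand each stream off to its own thread.
        ExecutorService pool = Executors.newFixedThreadPool(numStreams);
        for (final KafkaStream<Message> stream : streams.get("my-topic")) {
            pool.submit(() -> {
                ConsumerIterator<Message> it = stream.iterator();
                while (it.hasNext()) {
                    Message message = it.next(); // blocks until this stream's queue has data
                    process(message);            // if this blocks for minutes, this stream's
                                                 // queue fills and, per the explanation above,
                                                 // the shared fetcher can stall the other streams
                }
            });
        }
    }

    private static void process(Message message) {
        // application-specific handling (placeholder)
    }
}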