Re: Getting fixed amount of messages using Zookeeper based consumer
The inner loop keeps running. If I break it in the middle, is the Kafka broker
going to know that the rest of the messages in the stream were not delivered?
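
To make that concrete, this is roughly the change I have in mind (an untested
sketch; "streams" and "executor" are the same variables as in the quoted code
below, "consumer" stands for the (final) ConsumerConnector that created the
streams, and "maxMessages" is an arbitrary cutoff I would pick myself,
assuming auto-commit is disabled):

// Stop each stream after maxMessages and record progress explicitly.
final int maxMessages = 100; // example cutoff, not a byte-based fetchSize

for (final KafkaStream<Message> stream : streams) {
  executor.submit(new Runnable() {
    public void run() {
      int consumed = 0;
      for (MessageAndMetadata msgAndMetadata : stream) {
        // process message (msgAndMetadata.message())
        if (++consumed >= maxMessages) {
          break; // stop mid-stream; the rest stays unconsumed on the broker
        }
      }
      // commit the offsets consumed so far, so a restart resumes from here
      consumer.commitOffsets();
    }
  });
}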

Regards,
Vaibhav
GumGum
On Jul 11, 2012 5:05 PM, "Vaibhav Puranik" <[EMAIL PROTECTED]> wrote:

> Hi all,
>
> Is there any way to get a fixed amount of messages using Zookeeper based
> consumer (ConsumerConnector)?
>
> I know that with SimpleConsumer you can pass fetchSize as an argument and
> limit the number of messages coming back.
>
> This sample code creates 4 threads that keep consuming forever.
>
>
> // consume the messages in the threads
> for(final KafkaStream<Message> stream: streams) {
>   executor.submit(new Runnable() {
>     public void run() {
>       for(MessageAndMetadata msgAndMetadata: stream) {
>         // process message (msgAndMetadata.message())
>       }
>     }
>   });
> }
>
> Regards,
> Vaibhav
>
>
>
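
For comparison, the SimpleConsumer/fetchSize route mentioned above looks
roughly like this against the 0.7-era javaapi (kafka.javaapi.consumer.SimpleConsumer,
kafka.api.FetchRequest, kafka.javaapi.message.ByteBufferMessageSet). This is a
sketch only: the host, port, topic, partition, and size values are placeholders,
and fetchSize caps the bytes returned per fetch rather than a message count:

// One bounded fetch from a single partition.
SimpleConsumer simpleConsumer =
    new SimpleConsumer("localhost", 9092, 10000, 64 * 1024); // host, port, soTimeout, bufferSize
long offset = 0L;
int fetchSize = 100 * 1024; // upper bound in bytes for this fetch

FetchRequest request = new FetchRequest("my-topic", 0, offset, fetchSize);
ByteBufferMessageSet messages = simpleConsumer.fetch(request);
for (MessageAndOffset msgAndOffset : messages) {
  // process msgAndOffset.message() and track the offset yourself
  offset = msgAndOffset.offset();
}
simpleConsumer.close();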