I've got a problem like this. 1. I used the group name "GroupA" to consume the Kafka topic "topicA". Several days later, we could no longer get new data from that consumer. 2. Then I used the group name "groupB" to consume the same Kafka topic "topicA". With this new consumer I got the new data, and at the same time the old consumer (GroupA) started receiving the new data again. I don't know what's wrong with it. Did I do something wrong? Thanks
When consumption stopped in GroupA, were there any errors or exceptions in the consumer logic?
One common cause for a consumer to stop is that the application code hits an exception while consuming an event. In particular, if you are using a Java thread pool and haven't attached an exception handler, those exceptions will be swallowed and you won't see them in the application log. The solution is to add a try/catch block in the application code to log all Throwables.
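A minimal sketch of that advice: a task submitted to an `ExecutorService` via `execute()` can die silently if it throws, so the consuming loop is wrapped in a try/catch that records every `Throwable`. The class and field names here (`ConsumerTask`, `lastError`) are illustrative, not from any Kafka API.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;

public class ConsumerTask {
    // Holds the last error for demonstration; real code would log it.
    static final AtomicReference<Throwable> lastError = new AtomicReference<>();

    // Wrap the consuming loop so nothing thrown inside it is silently
    // swallowed by the thread pool.
    static Runnable logged(Runnable body) {
        return () -> {
            try {
                body.run();
            } catch (Throwable t) {
                lastError.set(t);
                System.err.println("consumer thread died: " + t);
            }
        };
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        // Simulated consumer body that fails mid-processing.
        pool.execute(logged(() -> { throw new IllegalStateException("bad message"); }));
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
        System.out.println("captured: " + lastError.get().getMessage());
    }
}
```

Without the `logged(...)` wrapper, the `IllegalStateException` would never reach the application log, and the consumer thread would simply stop, which matches the silent-stall symptom described above.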
Jun

On Sun, Jan 20, 2013 at 6:22 PM, Bo Sun <[EMAIL PROTECTED]> wrote:
It's not any exception. I found something today: during those "several days", there was no data being produced to Kafka. After those days, Kafka started getting new data again, but the consumer could not consume anything. Could those idle days (with no data) have caused the problem?
On Mon, Jan 21, 2013 at 2:13 PM, Jun Rao <[EMAIL PROTECTED]> wrote:
Right now, an iterator with a blocking hasNext() can be obtained from a stream. So what is the implementation behind the iterator? I assume there must be a queue that the iterator monitors, and a separate thread that fetches data and feeds the queue when it is almost empty. If that is the case, no further optimization needs to be done by us users. Please confirm whether there is a separate thread that fetches data and feeds the queue. Thanks. Regards,