I'm trying to handle the following scenario:
suppose Storm consumes messages from Kafka, and some of its
consumers crash for whatever reason, so they fail to process the messages
they have already consumed. If those messages cannot be reprocessed after
recovery, the system is not robust and has a data-integrity problem.
That is why I'm trying to understand what Kafka's capabilities are. I just
don't know what the best practice is here.
Maybe it is a matter of configuration?
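For what it's worth, the behavior I'm asking about can be sketched as a toy model (this is not the real Kafka client API, just an illustration of the at-least-once idea, assuming the consumer commits its offset only after a message is fully processed):

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of at-least-once delivery: the "broker" keeps an ordered log,
// and the consumer tracks a committed offset. If the consumer crashes after
// processing a message but before committing its offset, a restarted
// consumer resumes from the last committed offset and sees that message
// again. All names here are illustrative, not a real Kafka/Storm API.
public class ReplayDemo {
    static final List<String> log = List.of("m0", "m1", "m2", "m3");
    public static long committedOffset = 0; // survives consumer restarts

    // Consume from the last committed offset; if crashAfterProcessing is
    // reached, simulate a crash after processing but before committing.
    public static List<String> consume(long crashAfterProcessing) {
        List<String> processed = new ArrayList<>();
        for (long off = committedOffset; off < log.size(); off++) {
            processed.add(log.get((int) off));   // process the message
            if (off == crashAfterProcessing) {
                return processed;                // crash before commit
            }
            committedOffset = off + 1;           // commit only after success
        }
        return processed;
    }

    public static void main(String[] args) {
        // First run: processes m0 and m1, but crashes before committing m1.
        System.out.println(ReplayDemo.consume(1));  // [m0, m1]
        // Restart: m1 is delivered again, then m2 and m3.
        System.out.println(ReplayDemo.consume(-1)); // [m1, m2, m3]
    }
}
```

The point of the sketch is that redelivery of m1 is exactly the "consume the same message more than once" case discussed below.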
On Thu, Aug 1, 2013 at 1:31 AM, Milind Parikh <[EMAIL PROTECTED]>wrote:
> It is possible to consume the same message more than once with the same
> consumer. However, WHAT you actually do with the message (such as
> idempotent writes) is the trickier part.
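The idempotent-writes idea mentioned above can be sketched as a toy example: key each write by a stable message ID, so a redelivered message overwrites rather than duplicates. The in-memory map stands in for a real downstream store; the names are illustrative, not a real Kafka/Storm API.

```java
import java.util.HashMap;
import java.util.Map;

// Idempotent sink: writing the same (id, value) pair twice has the same
// effect as writing it once, so at-least-once redelivery does no harm.
public class IdempotentSink {
    private final Map<String, String> store = new HashMap<>();

    // put() is idempotent per key: a duplicate write replaces the
    // existing entry instead of adding a second copy.
    public void write(String messageId, String value) {
        store.put(messageId, value);
    }

    public int size() { return store.size(); }

    public static void main(String[] args) {
        IdempotentSink sink = new IdempotentSink();
        sink.write("msg-42", "hello");
        sink.write("msg-42", "hello"); // redelivery of the same message
        System.out.println(sink.size()); // prints 1, not 2
    }
}
```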
> On Wed, Jul 31, 2013 at 8:22 AM, Oleg Ruchovets <[EMAIL PROTECTED]
> > Hi,
> > I just don't know which mailing list is the right one for this question
> > (storm or kafka)? Sorry for the cross-post.
> > I just read the documentation which describes guaranteed message
> > processing with storm -
> > https://github.com/nathanmarz/storm/wiki/Guaranteeing-message-processing
> > The question is: what happens to a message that was consumed by
> > storm but failed to process? If I use the anchoring technique and try
> > to process the message a second time, will it still be available in
> > kafka (I am using the storm-kafka spout)?
> > I mean, is it possible to consume the same message in kafka more than
> > once with the same consumer?
> > Thanks
> > Oleg.