Re: Exception in Kafka Broker when consumer shuts down/crashes.
Just curious, what was the problem? :)

Thanks,
Neha
On Fri, Jan 4, 2013 at 1:23 PM, Subhash Agrawal <[EMAIL PROTECTED]> wrote:

> Thanks Neha. I figured out the problem. Consumer is still picking up
> messages without broker restart.
>
> -----Original Message-----
> From: Neha Narkhede [mailto:[EMAIL PROTECTED]]
> Sent: Friday, January 04, 2013 1:01 PM
> To: [EMAIL PROTECTED]
> Subject: Re: Exception in Kafka Broker when consumer shuts down/crashes.
>
> Hi,
>
> A few questions regarding your test/setup -
> 1. Which version of Kafka are you using?
> 2. Are you using the high-level consumer or SimpleConsumer?
> 3. Can you describe your setup a little more? Are you continuously sending
> messages to that broker, then shutting down the consumer and no longer
> receiving data?
>
> You shouldn't have to restart the broker when you kill the consumer, so
> maybe something else is wrong here.
>
> Thanks,
> Neha
>
>
> On Fri, Jan 4, 2013 at 11:58 AM, Subhash Agrawal <[EMAIL PROTECTED]> wrote:
>
> > Hi,
> >
> > I noticed that when I kill the consumer, I see an exception in the Kafka
> > broker and I need to restart the Kafka broker server to get messages again.
> > Do I need to tune anything to avoid this error or to avoid restarting the
> > Kafka broker?
> >
> > Thanks
> > Subhash Agrawal
> >
> > Here is the exception:
> >
> > [2013-01-05 11:12:59,361] INFO Closing socket connection to /127.0.0.1. (kafka.network.Processor)
> > [2013-01-04 11:45:00,456] ERROR Closing socket for /127.0.0.1 because of error (kafka.network.Processor)
> > java.io.IOException: An existing connection was forcibly closed by the remote host
> >         at sun.nio.ch.SocketDispatcher.read0(Native Method)
> >         at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:25)
> >         at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:198)
> >         at sun.nio.ch.IOUtil.read(IOUtil.java:171)
> >         at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:245)
> >         at kafka.utils.Utils$.read(Utils.scala:538)
> >         at kafka.network.BoundedByteBufferReceive.readFrom(BoundedByteBufferReceive.scala:54)
> >         at kafka.network.Processor.read(SocketServer.scala:311)
> >         at kafka.network.Processor.run(SocketServer.scala:214)
> >         at java.lang.Thread.run(Thread.java:662)
> >
>
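A minimal sketch of the clean consumer shutdown being discussed here, assuming the 0.8-era Java high-level consumer API (kafka.javaapi.consumer.ConsumerConnector; 0.7 used slightly different property names such as zk.connect and groupid). The topic name, group id, and ZooKeeper address are placeholders. Calling shutdown() before the process exits closes the consumer's connections in an orderly way, which should avoid the ERROR above, since the broker sees a normal close rather than a forcibly reset connection:

    import java.util.Collections;
    import java.util.List;
    import java.util.Map;
    import java.util.Properties;

    import kafka.consumer.Consumer;
    import kafka.consumer.ConsumerConfig;
    import kafka.consumer.ConsumerIterator;
    import kafka.consumer.KafkaStream;
    import kafka.javaapi.consumer.ConsumerConnector;

    public class CleanShutdownConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("zookeeper.connect", "localhost:2181"); // placeholder ZooKeeper address
            props.put("group.id", "test-group");              // placeholder consumer group

            final ConsumerConnector consumer =
                Consumer.createJavaConsumerConnector(new ConsumerConfig(props));

            // Shut the consumer down cleanly on SIGTERM instead of letting the
            // broker discover an abruptly dropped socket.
            Runtime.getRuntime().addShutdownHook(new Thread() {
                @Override
                public void run() {
                    consumer.shutdown();
                }
            });

            // One stream for the (placeholder) topic "my-topic".
            Map<String, List<KafkaStream<byte[], byte[]>>> streams =
                consumer.createMessageStreams(Collections.singletonMap("my-topic", 1));
            ConsumerIterator<byte[], byte[]> it = streams.get("my-topic").get(0).iterator();
            while (it.hasNext()) {
                System.out.println(new String(it.next().message()));
            }
        }
    }

Note that a kill -9 bypasses the shutdown hook, so the broker will still log the IOException in that case; as Neha points out above, that log entry alone should not require a broker restart.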

 