Lots of Exceptions showing up in logs
Hi,

Can anyone help me debug what's going on to cause these exceptions?
1. Getting lots of these IllegalArgumentExceptions (and there were a few
other nio.Buffer-related exceptions in our logs too). A small sketch of the
Buffer.limit behaviour I'm wondering about follows the trace.

19:22:34,597 WARN  [kafka.consumer.ConsumerFetcherThread]
(ConsumerFetcherThread-hbaseApeConsumer_ape-aux109.dc.farecompare.com-1380666397017-66bf5430-0-2)
[ConsumerFetcherThread-hbaseApeConsumer_ape-aux109.dc.farecompare.com-1380666397017-66bf5430-0-2],
Error in fetch Name: FetchRequest; Version: 0; CorrelationId: 15; ClientId:
APEConsumer-ConsumerFetcherThread-hbaseApeConsumer_ape-aux109.dc.farecompare.com-1380666397017-66bf5430-0-2;
ReplicaId: -1; MaxWait: 400 ms; MinBytes: 1000 bytes; RequestInfo: [APE,1]
-> PartitionFetchInfo(120858,500000000),[APE,5] ->
PartitionFetchInfo(120858,500000000),[APE,9] ->
PartitionFetchInfo(120836,500000000),[APE,2] ->
PartitionFetchInfo(120858,500000000),[APE,11] ->
PartitionFetchInfo(120836,500000000),[APE,6] ->
PartitionFetchInfo(460329,500000000),[APE,0] ->
PartitionFetchInfo(124608,500000000),[APE,7] ->
PartitionFetchInfo(126691,500000000),[APE,3] ->
PartitionFetchInfo(127107,500000000),[APE,4] ->
PartitionFetchInfo(127107,500000000),[APE,10] ->
PartitionFetchInfo(2056250,500000000),[APE,8] ->
PartitionFetchInfo(469501,500000000): java.lang.IllegalArgumentException
        at java.nio.Buffer.limit(Buffer.java:267) [rt.jar:1.7.0_25]
        at kafka.api.FetchResponsePartitionData$.readFrom(FetchResponse.scala:33) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.api.TopicData$$anonfun$1.apply(FetchResponse.scala:87) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.api.TopicData$$anonfun$1.apply(FetchResponse.scala:85) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206) [scala-library-2.8.0.jar:]
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:206) [scala-library-2.8.0.jar:]
        at scala.collection.immutable.Range$ByOne$class.foreach(Range.scala:282) [scala-library-2.8.0.jar:]
        at scala.collection.immutable.Range$$anon$1.foreach(Range.scala:274) [scala-library-2.8.0.jar:]
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:206) [scala-library-2.8.0.jar:]
        at scala.collection.immutable.Range.map(Range.scala:39) [scala-library-2.8.0.jar:]
        at kafka.api.TopicData$.readFrom(FetchResponse.scala:85) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.api.FetchResponse$$anonfun$3.apply(FetchResponse.scala:146) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.api.FetchResponse$$anonfun$3.apply(FetchResponse.scala:145) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:227) [scala-library-2.8.0.jar:]
        at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:227) [scala-library-2.8.0.jar:]
        at scala.collection.immutable.Range$ByOne$class.foreach(Range.scala:285) [scala-library-2.8.0.jar:]
        at scala.collection.immutable.Range$$anon$1.foreach(Range.scala:274) [scala-library-2.8.0.jar:]
        at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:227) [scala-library-2.8.0.jar:]
        at scala.collection.immutable.Range.flatMap(Range.scala:39) [scala-library-2.8.0.jar:]
        at kafka.api.FetchResponse$.readFrom(FetchResponse.scala:145) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.consumer.SimpleConsumer.fetch(SimpleConsumer.scala:113) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.server.AbstractFetcherThread.processFetchRequest(AbstractFetcherThread.scala:96) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.server.AbstractFetcherThread.doWork(AbstractFetcherThread.scala:88) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.utils.ShutdownableThread.run(ShutdownableThread.scala:51) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]

2. We're running on a 10Gb network with a couple of pretty beefy broker
boxes, so I don't understand why we would get a lot of
SocketTimeoutExceptions. A rough sketch of our consumer settings follows
the trace.

19:27:28,900 INFO  [kafka.consumer.SimpleConsumer]
(ConsumerFetcherThread-hbaseApeConsumer_ape-aux109.dc.farecompare.com-1380666397017-66bf5430-0-2)
Reconnect due to socket error: : java.net.SocketTimeoutException
        at sun.nio.ch.SocketAdaptor$SocketInputStream.read(SocketAdaptor.java:226) [rt.jar:1.7.0_25]
        at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:103) [rt.jar:1.7.0_25]
        at java.nio.channels.Channels$ReadableByteChannelImpl.read(Channels.java:385) [rt.jar:1.7.0_25]
        at kafka.utils.Utils$.read(Utils.scala:394) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.network.BoundedByteBufferReceive.readFrom(BoundedByteBufferReceive.scala:67) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.network.Receive$class.readCompletely(Transmission.scala:56) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.network.BoundedByteBufferReceive.readCompletely(BoundedByteBufferReceive.scala:29) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.network.BlockingChannel.receive(BlockingChannel.scala:100) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.consumer.SimpleConsumer.liftedTree1$1(SimpleConsumer.scala:73) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.consumer.SimpleConsumer.kafka$consumer$SimpleConsumer$$sendRequest(SimpleConsumer.scala:71) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.consumer.SimpleConsumer$$anonfun$fetch$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(SimpleConsumer.scala:110) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.consumer.SimpleConsumer$$anonfun$fetch$1$$anonfun$apply$mcV$sp$1.apply(SimpleConsumer.scala:110) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
        at kafka.consumer.SimpleConsumer$$anonfun$fetch$1$$anonfun$apply$mcV$sp$1.apply(SimpleConsumer.scala:110) [core-kafka-0.8.0-beta1.jar:0.8.0-beta1]
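
For reference, our consumer settings look roughly like the sketch below. The
property names are from the 0.8 consumer config docs; the values are my best
reconstruction from the MaxWait/MinBytes/fetch-size numbers in the trace
above, so treat them as illustrative rather than exact:

    import java.util.Properties;
    import kafka.consumer.Consumer;
    import kafka.consumer.ConsumerConfig;
    import kafka.javaapi.consumer.ConsumerConnector;

    public class ApeConsumerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("zookeeper.connect", "zk1:2181");        // placeholder host
            props.put("group.id", "hbaseApeConsumer");
            props.put("fetch.message.max.bytes", "500000000"); // matches PartitionFetchInfo size
            props.put("fetch.min.bytes", "1000");              // matches MinBytes: 1000 bytes
            props.put("fetch.wait.max.ms", "400");             // matches MaxWait: 400 ms
            props.put("socket.timeout.ms", "30000");           // default, I believe
            ConsumerConnector connector =
                Consumer.createJavaConsumerConnector(new ConsumerConfig(props));
            // ... message streams are created and consumed elsewhere ...
            connector.shutdown();
        }
    }

Is it plausible that shipping a fetch of up to 500000000 bytes per partition
simply takes longer than socket.timeout.ms once the brokers are busy, even on
a 10Gb link? Or should socket.timeout.ms just be bumped well above
fetch.wait.max.ms?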