Kafka, mail # user - Kafka 08 clients


Re: Kafka 08 clients
Andrew Otto 2013-08-12, 13:51
This is the Kafka C client for 0.8 we are using at Wikimedia:

  https://github.com/edenhill/librdkafka

If you're using Debian/Ubuntu, you can use the debian branch here to build a .deb:

  https://github.com/paravoid/librdkafka/tree/debian
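
For anyone who hasn't tried librdkafka yet, a minimal sketch of producing a single message with its C producer API might look roughly like the following. The broker address "localhost:9092" and the topic name "test" are placeholders, not values taken from this thread.

  /* Minimal librdkafka producer sketch; broker and topic are placeholders. */
  #include <errno.h>
  #include <stdio.h>
  #include <string.h>
  #include <librdkafka/rdkafka.h>

  int main(void) {
      char errstr[512];
      const char *payload = "hello from librdkafka";

      /* Create a producer handle with default configuration. */
      rd_kafka_t *rk = rd_kafka_new(RD_KAFKA_PRODUCER, rd_kafka_conf_new(),
                                    errstr, sizeof(errstr));
      if (!rk) {
          fprintf(stderr, "rd_kafka_new failed: %s\n", errstr);
          return 1;
      }

      /* Point the client at a broker (placeholder address). */
      rd_kafka_brokers_add(rk, "localhost:9092");

      /* Produce one message, letting librdkafka choose the partition. */
      rd_kafka_topic_t *rkt = rd_kafka_topic_new(rk, "test", NULL);
      if (rd_kafka_produce(rkt, RD_KAFKA_PARTITION_UA, RD_KAFKA_MSG_F_COPY,
                           (void *)payload, strlen(payload),
                           NULL, 0, NULL) == -1)
          fprintf(stderr, "produce failed: %s\n", strerror(errno));

      /* Serve callbacks until the outbound queue has drained. */
      while (rd_kafka_outq_len(rk) > 0)
          rd_kafka_poll(rk, 100);

      rd_kafka_topic_destroy(rkt);
      rd_kafka_destroy(rk);
      return 0;
  }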

On Aug 12, 2013, at 12:06 AM, Jun Rao <[EMAIL PROTECTED]> wrote:

> At LinkedIn, we built a C producer client for 0.8, and we plan to open
> source it in the next few weeks.
>
> Thanks,
>
> Jun
>
>
> On Sat, Aug 10, 2013 at 6:29 PM, Mark <[EMAIL PROTECTED]> wrote:
>
>> Is there an *official* client out there?
>>
>> On Aug 10, 2013, at 4:10 PM, Scott Clasen <[EMAIL PROTECTED]> wrote:
>>
>>> bpot/poseidon on GitHub is a Ruby 0.8 client; it works fine for me.
>>>
>>> Sent from my iPhone
>>>
>>> On Aug 10, 2013, at 3:08 PM, Timothy Chen <[EMAIL PROTECTED]> wrote:
>>>
>>>> That definitely means it's not up to date with the protocol. I tried
>>>> the Java client and it was working with the latest 0.8 API.
>>>>
>>>> Not sure about any other languages.
>>>>
>>>> Tim
>>>>
>>>>
>>>> On Sat, Aug 10, 2013 at 2:55 PM, Mark <[EMAIL PROTECTED]> wrote:
>>>>
>>>>> Are all Kafka clients working with the latest version of Kafka?
>>>>>
>>>>> I tried the kafka-rb client and a simple example listed in the README,
>>>>> but I keep getting a nasty error:
>>>>> require 'kafka'
>>>>> producer = Kafka::Producer.new
>>>>> message = Kafka::Message.new("some random message content")
>>>>> producer.push(message)
>>>>>
>>>>> [2013-08-10 14:49:52,166] ERROR Closing socket for /127.0.0.1 because of error (kafka.network.Processor)
>>>>> java.nio.BufferUnderflowException
>>>>>      at java.nio.HeapByteBuffer.get(HeapByteBuffer.java:127)
>>>>>      at java.nio.ByteBuffer.get(ByteBuffer.java:675)
>>>>>      at kafka.api.ApiUtils$.readShortString(ApiUtils.scala:38)
>>>>>      at kafka.api.ProducerRequest$$anonfun$1.apply(ProducerRequest.scala:40)
>>>>>      at kafka.api.ProducerRequest$$anonfun$1.apply(ProducerRequest.scala:38)
>>>>>      at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:227)
>>>>>      at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:227)
>>>>>      at scala.collection.immutable.Range$ByOne$class.foreach(Range.scala:282)
>>>>>      at scala.collection.immutable.Range$$anon$1.foreach(Range.scala:274)
>>>>>      at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:227)
>>>>>      at scala.collection.immutable.Range.flatMap(Range.scala:39)
>>>>>      at kafka.api.ProducerRequest$.readFrom(ProducerRequest.scala:38)
>>>>>      at kafka.api.RequestKeys$$anonfun$1.apply(RequestKeys.scala:34)
>>>>>      at kafka.api.RequestKeys$$anonfun$1.apply(RequestKeys.scala:34)
>>>>>      at kafka.network.RequestChannel$Request.<init>(RequestChannel.scala:49)
>>>>>      at kafka.network.Processor.read(SocketServer.scala:345)
>>>>>      at kafka.network.Processor.run(SocketServer.scala:245)
>>>>>      at java.lang.Thread.run(Thread.java:680)
>>>>>
>>>>>
>>
>>