Re: Kafka 0.8.0 in maven
Yes, you are correct. I misunderstood the feature.
auto.create.topics.enable has an interesting side effect. Say your Kafka
configuration file defaults to 10 partitions per topic. If the topic
auto-creates, you get 10 partitions. Creating the topic explicitly, the way I
did it, gives you control over the number of partitions regardless of how the
broker is configured.
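
For reference, a minimal sketch of that kind of explicit creation, assuming the
kafka.admin.AdminUtils.createTopic helper from the 0.8.x line (in 0.8.0 itself
the equivalent entry point may be kafka.admin.CreateTopicCommand); the topic
name, partition count, and ZooKeeper address below are placeholders, not values
from this thread:

    import java.util.Properties;

    import org.I0Itec.zkclient.ZkClient;

    import kafka.admin.AdminUtils;
    import kafka.utils.ZKStringSerializer$;

    public class ExplicitTopicSetup {
        public static void main(String[] args) {
            // Connect to the ZooKeeper ensemble the (embedded) broker registered
            // with; "localhost:2181" is a placeholder test ZooKeeper address.
            ZkClient zkClient = new ZkClient("localhost:2181", 10000, 10000,
                    ZKStringSerializer$.MODULE$);
            try {
                // Create the topic explicitly with 3 partitions and replication
                // factor 1, independent of the broker's num.partitions default.
                AdminUtils.createTopic(zkClient, "test-topic", 3, 1, new Properties());
            } finally {
                zkClient.close();
            }
        }
    }
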
On Thu, Nov 7, 2013 at 4:16 AM, Chris Bedford <[EMAIL PROTECTED]> wrote:

> auto.create.topics.enable is true by default. For this test I relied on that
> property. I don't think real production code should rely on it, though; it's
> too easy to mess things up with a typo.   - cb
>
>
> On Wed, Nov 6, 2013 at 9:28 PM, Edward Capriolo <[EMAIL PROTECTED]> wrote:
>
> > One thing I noticed about your code: I thought in Kafka 0.8.0 topics are
> > not created automatically on the first message, and I do not see anywhere
> > in your code that creates the topics.
> >
> > I am creating the topic as part of my tests.
> >
> > Before, I was not setting that property and was only getting some of the
> > messages, which is really weird. You are also using a slightly different
> > Kafka than me (your pom.xml vs. mine).
> >
> > Here is what I have now:
> >
> >
> >
> > https://github.com/edwardcapriolo/IronCount/blob/iron-ng/src/test/java/com/jointhegrid/ironcount/IronIntegrationTest.java
> >
> > https://github.com/edwardcapriolo/IronCount/blob/iron-ng/src/test/java/com/jointhegrid/ironcount/IntegrationTest.java
> >
> > The property I mentioned is making all my tests happy, so that was the
> > magic bullet for me. Everything else I did in the code above, cleanup-wise,
> > had no effect: I did all the cleanups and nothing was working until I
> > switched on that param.
> >
> >
> > On Thu, Nov 7, 2013 at 12:15 AM, Chris Bedford <[EMAIL PROTECTED]> wrote:
> >
> > > Do you need to use that configuration to get the tests (as currently
> > > checked in) to pass? I did not find I needed that particular knob
> > > (although it is a good one to know about).
> > >
> > > And sorry about your suffering... I can sympathize!
> > >
> > >  - cb
> > >
> > >
> > > On Wed, Nov 6, 2013 at 7:38 PM, Edward Capriolo <[EMAIL PROTECTED]> wrote:
> > >
> > > > After about 5 days of relentless head pounding and twiddling with
> > > > everything under the sun, I figured it out.
> > > >
> > > > If you read to the bottom of this page:
> > > >
> > > > https://cwiki.apache.org/confluence/display/KAFKA/Consumer+Group+Example
> > > >
> > > > You find:
> > > >
> > > > consumerProps.put("auto.offset.reset", "smallest");
> > > >
> > > > Now I can bring up the entire stack in the JVM and test like I used to!
> > > >
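> > > > For context, a minimal sketch of where that property fits when building the
> > > > 0.8 high-level consumer from that wiki page (a sketch, not the actual test
> > > > code; the zookeeper.connect and group.id values are placeholders):
> > > >
> > > > import java.util.Properties;
> > > > import kafka.consumer.Consumer;
> > > > import kafka.consumer.ConsumerConfig;
> > > > import kafka.javaapi.consumer.ConsumerConnector;
> > > >
> > > > public class ConsumerSetup {
> > > >     static ConsumerConnector buildConsumer(String zkConnect, String groupId) {
> > > >         Properties consumerProps = new Properties();
> > > >         consumerProps.put("zookeeper.connect", zkConnect); // e.g. embedded test ZooKeeper
> > > >         consumerProps.put("group.id", groupId);
> > > >         // Without this, a consumer with no committed offset starts at the
> > > >         // latest offset and misses messages produced before it attached.
> > > >         consumerProps.put("auto.offset.reset", "smallest");
> > > >         return Consumer.createJavaConsumerConnector(new ConsumerConfig(consumerProps));
> > > >     }
> > > > }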
> > > >
> > > > On Wed, Nov 6, 2013 at 2:52 AM, Chris Bedford <[EMAIL PROTECTED]> wrote:
> > > >
> > > > > Hi, Edward..
> > > > >
> > > > > Yup, you are correct: when we get to a little over 1000 messages, the
> > > > > program was failing with the exception stack trace I included below.
> > > > >
> > > > > I fixed the test so it passes as long as the consumer gets all messages
> > > > > sent by the producer, even if an exception is thrown during shutdown.
> > > > >
> > > > > This isn't as clean as I'd like it to be. I tried calling
> > > > > kafkaServer.awaitShutdown(), and I tried inserting some Thread.sleep()
> > > > > calls to give the consumer and producer shutdown procedures some time to
> > > > > complete, but I still got the stack trace shown below.
> > > > >
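> > > > > Roughly, the shutdown ordering in question; a sketch rather than the exact
> > > > > test code, with consumer, producer, kafkaServer, and zkServer as placeholder
> > > > > names for the test's fields:
> > > > >
> > > > > consumer.shutdown();          // kafka.javaapi.consumer.ConsumerConnector
> > > > > producer.close();             // kafka.javaapi.producer.Producer
> > > > > kafkaServer.shutdown();       // kafka.server.KafkaServer
> > > > > kafkaServer.awaitShutdown();  // block until the broker has fully stopped
> > > > > zkServer.shutdown();          // stop the embedded ZooKeeper last
> > > > >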
> > > > > I don't have time to chase the bug any further, but I did correct the
> > > > > test, so you can pull it and see that it passes if you want.
> > > > >
> > > > > Maybe we should file a bug on this? It might be that I'm using the API
> > > > > incorrectly. I'm not sure at this point.
> > > > >
> > > > > Anyway, thanks for informing me of the issue.
> > > > >
> > > > >
> > > > >
> > > > >
> > > > > *Failure due to broken shutdown:*