Re: Kafka Hadoop Consumer for multiple brokers
Neha Narkhede 2013-06-04, 16:07
It is probably possible to make the serde format pluggable. You could
start that discussion on the Camus mailing list.
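A pluggable serde along these lines could be sketched as a decoder interface that a Camus-style job looks up by configured name. This is a hypothetical illustration only; the class and function names below (`MessageDecoder`, `get_decoder`, the `DECODERS` registry) are assumptions and are not the actual Camus API:

```python
# Hypothetical sketch of a pluggable message decoder, in the spirit of
# making the serde format configurable. None of these names come from
# the real Camus codebase.
import json
from abc import ABC, abstractmethod


class MessageDecoder(ABC):
    """Decodes a raw Kafka message payload into a record."""

    @abstractmethod
    def decode(self, payload: bytes):
        ...


class JsonDecoder(MessageDecoder):
    """One concrete format; an Avro or protobuf decoder would slot in the same way."""

    def decode(self, payload: bytes):
        return json.loads(payload)


# A registry maps a configured decoder name to an implementation, so
# switching formats becomes a config change rather than a code change.
DECODERS = {"json": JsonDecoder}


def get_decoder(name: str) -> MessageDecoder:
    return DECODERS[name]()


record = get_decoder("json").decode(b'{"user": "samir", "clicks": 3}')
```

A protobuf-backed decoder would implement the same `decode` method using generated protobuf classes, which is the kind of extension point the pluggability discussion would need to settle.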
On Jun 4, 2013 8:51 AM, "Samir Madhavan" <
[EMAIL PROTECTED]> wrote:
> Thanks Jun.
> For one of our projects, the data is coming in binary protobuf format.
> Camus is based on Avro, so what would be the best way to handle the
> protobuf data files?
> On Tue, Jun 4, 2013 at 8:40 PM, Jun Rao <[EMAIL PROTECTED]> wrote:
> > The idea is that each mapper connects to only a single Kafka broker.
> > Each line in the input file specifies the broker URI, topic, partition,
> > and offset.
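The per-line layout Jun describes could be parsed along these lines. This is a minimal sketch under assumptions: the whitespace-separated field order and the example broker URI are illustrative, and the actual contrib hadoop-consumer may encode its input differently:

```python
# Minimal sketch: parse one input line of the assumed form
#   <broker-uri> <topic> <partition> <offset>
# Each mapper would then connect only to its line's broker, which is why
# a multi-broker cluster needs one input line per broker/partition.
from dataclasses import dataclass


@dataclass
class KafkaSplit:
    broker_uri: str
    topic: str
    partition: int
    offset: int


def parse_input_line(line: str) -> KafkaSplit:
    broker_uri, topic, partition, offset = line.split()
    return KafkaSplit(broker_uri, topic, int(partition), int(offset))


split = parse_input_line("tcp://broker1:9092 clicks 0 123456")
```

Under this model, covering multiple brokers is a matter of generating one input line per broker/partition pair rather than changing the consumer itself.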
> > The hadoop consumer in contrib is probably a bit outdated. The one that
> > LinkedIn uses now can be found at https://github.com/linkedin/camus
> > Thanks,
> > Jun
> > On Tue, Jun 4, 2013 at 7:29 AM, Samir Madhavan <
> > [EMAIL PROTECTED]> wrote:
> > > Hi,
> > >
> > > I was going through the hadoop-consumer in the contrib folder. There is
> > > a property that asks for the Kafka server URI. This might sound silly,
> > > but from looking at it, it seems to be only for a single Kafka broker.
> > >
> > > When we have multiple brokers, how do we implement the hadoop-consumer
> > for
> > > them?
> > >
> > > Regards,
> > > Samir
> > >
> *Samir Madhavan *| Data Scientist | Flutura Business Solutions Pvt. Ltd |
> Floor, 'Geetanjali', #693, 15th Cross, J.P Nagar 2nd Phase, Bangalore,
> India - 560078 | Mobile: +91 9886139631 | email: *
> [EMAIL PROTECTED]*| www.fluturasolutions.com |