Hadoop's Avro dependencies (MapReduce user mailing list)


Rahul Bhattacharjee   2012-08-22, 05:39
Harsh J               2012-08-22, 05:52
Rahul Bhattacharjee   2012-08-22, 06:12
Harsh J               2012-08-22, 06:16
Bertrand Dechoux      2012-08-22, 06:27
Harsh J               2012-08-22, 07:46
Re: Hadoop's Avro dependencies.
Thanks a lot, Harsh! I should read the code instead of bothering people here
in the group.

Rgds,
Rahul

On Wed, Aug 22, 2012 at 11:46 AM, Harsh J <[EMAIL PROTECTED]> wrote:

> Hi,
>
> By default, only the Writable serialization technique is used. Only if
> you choose to use Avro in your job is Avro serialization used at the
> intermediate serialization step.
>
> On Wed, Aug 22, 2012 at 11:42 AM, Rahul Bhattacharjee
> <[EMAIL PROTECTED]> wrote:
> > Well, thanks a lot, Harsh. I thought Avro was a result of Hadoop's
> > serialization needs.
> >
> > If Avro isn't used for serializing map outputs and transferring them to
> > the reducers, then what is used for this, if not Avro?
> >
> > Thanks,
> > Rahul
> >
> > On Wed, Aug 22, 2012 at 11:22 AM, Harsh J <[EMAIL PROTECTED]> wrote:
> >>
> >> Hi,
> >>
> >> Hadoop doesn't use Avro serialization on its own. However, Hadoop 2.x
> >> does provide an AvroSerialization class you can use optionally to
> >> serialize using Avro libraries, and the 2.x distribution does ship an
> >> Avro jar along with it.
> >>
> >> On Wed, Aug 22, 2012 at 11:09 AM, Rahul Bhattacharjee
> >> <[EMAIL PROTECTED]> wrote:
> >> > Hi,
> >> >
> >> > I was going through the Apache Hadoop distribution's dependencies
> >> > (jars in the lib folder) and I could not find avro-1.x.x.jar.
> >> >
> >> > I thought Hadoop internally uses Avro as its serialization mechanism
> >> > for intermediate data transmission (transporting map outputs to the
> >> > reducers, etc.), so the Hadoop distribution must have Avro within it.
> >> > But it doesn't!
> >> >
> >> > Can someone enlighten me on this?
> >> >
> >> > Thanks,
> >> > Rahul
> >> >
> >>
> >>
> >>
> >> --
> >> Harsh J
> >
> >
>
>
>
> --
> Harsh J
>
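
Putting Harsh's first point in concrete terms: in a stock MapReduce job the
intermediate key/value classes implement Writable, and the shuffle serializes
them through WritableSerialization, with no Avro involved. Below is a minimal
sketch of that default path (the usual word-count shape; class names such as
WritableShuffleExample are illustrative, not from the thread):

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WritableShuffleExample {

  // Map output key/value types implement Writable, so the framework
  // serializes them for the shuffle via WritableSerialization, which is
  // registered in io.serializations by default.
  public static class TokenMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      for (String token : value.toString().split("\\s+")) {
        word.set(token);
        context.write(word, ONE); // serialized via Writable.write(DataOutput)
      }
    }
  }

  public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable v : values) {
        sum += v.get(); // values were deserialized back from the shuffle
      }
      context.write(key, new IntWritable(sum));
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "writable shuffle");
    job.setJarByClass(WritableShuffleExample.class);
    job.setMapperClass(TokenMapper.class);
    job.setReducerClass(SumReducer.class);
    // These classes determine which registered Serialization handles the
    // intermediate data; Writables are matched by WritableSerialization.
    job.setMapOutputKeyClass(Text.class);
    job.setMapOutputValueClass(IntWritable.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}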
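For the opt-in path Harsh describes, Hadoop 2.x ships Avro-backed
Serialization classes under org.apache.hadoop.io.serializer.avro
(AvroReflectSerialization, AvroSpecificSerialization,
AvroGenericSerialization). The following sketch wires one of them into
io.serializations so a non-Writable type can travel through the shuffle;
ClickEvent and com.example.events are hypothetical names, and depending on
the exact 2.x build some of these serializers may already be registered in
core-default.xml:

// Sketch only: com.example.events and ClickEvent are illustrative names.
package com.example.events;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.serializer.WritableSerialization;
import org.apache.hadoop.io.serializer.avro.AvroReflectSerialization;
import org.apache.hadoop.mapreduce.Job;

public class AvroIntermediateSetup {

  // A plain Java type (no Writable interface) to use as the map output value.
  public static class ClickEvent {
    public String url;
    public long timestamp;
  }

  public static Job configure() throws Exception {
    Configuration conf = new Configuration();

    // io.serializations lists the Serialization implementations the
    // framework may use for intermediate (map output) data. Keep the
    // Writable default and add Avro reflect-based serialization.
    conf.setStrings("io.serializations",
        WritableSerialization.class.getName(),
        AvroReflectSerialization.class.getName());

    // AvroReflectSerialization only claims classes whose package is listed
    // under avro.reflect.pkgs (or that implement AvroReflectSerializable).
    conf.set("avro.reflect.pkgs", "com.example.events");

    Job job = Job.getInstance(conf, "avro intermediate serialization");
    job.setMapOutputKeyClass(Text.class);         // still a Writable key
    job.setMapOutputValueClass(ClickEvent.class); // Avro-serialized value
    return job;
  }
}

With this registration, the SerializationFactory picks
AvroReflectSerialization for ClickEvent values at the intermediate step,
while the Text keys still go through WritableSerialization, matching Harsh's
description of Avro as optional rather than the default.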