amit nanda 2013-08-19, 16:12
Harsh J 2013-08-22, 06:31
Bill Baird 2013-08-22, 14:47
Re: Avro file Compression
Scott Carey 2013-08-22, 21:35
The file format compresses in blocks, and the block size is configurable.
This will compress across objects in a block, so it works for small objects
as well as large ones as long as the total block size is large enough.
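Scott's point about whole-block compression can be illustrated with plain deflate (Python's zlib standing in for Avro's deflate codec; the record contents and counts below are invented for the demo):

```python
import zlib

# 1000 small, similar records (~45 bytes each), as Avro would buffer
# into a single data-file block before compressing.
records = [('{"id": %d, "name": "user%d", "active": true}' % (i, i)).encode()
           for i in range(1000)]

# Deflating each record on its own: fixed per-stream overhead and no
# cross-record matches, so small objects barely shrink (or even grow).
per_record = sum(len(zlib.compress(r)) for r in records)

# Deflating the whole block at once, as the file format does: repeated
# field names and values compress across neighboring records.
one_block = len(zlib.compress(b"".join(records)))

print(per_record, one_block)  # one_block is far smaller than per_record
```

This is why the block size matters: the codec only sees redundancy within one block, so the block has to be big enough to contain it.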
I have found that I can increase the ratio of compression by ordering the
objects carefully so that neighbor records have more in common.
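A rough sketch of that reordering effect, again with zlib standing in for the codec (the payloads are synthetic, and the exact numbers will vary): deflate can only match against the previous 32 KB, so duplicates that sit far apart in a shuffled stream go unnoticed, while sorting puts identical records side by side.

```python
import random
import zlib

random.seed(42)

# 2000 distinct ~120-byte payloads, each occurring 20 times (~4.8 MB).
# Shuffled, most duplicates are farther apart than deflate's 32 KB
# match window; sorted, identical records become adjacent.
payloads = ["key=%06d|" % i + "".join(random.choices("abcdefgh", k=110))
            for i in range(2000)]
records = [p for p in payloads for _ in range(20)]

shuffled = records[:]
random.shuffle(shuffled)

size_shuffled = len(zlib.compress("".join(shuffled).encode(), 9))
size_sorted = len(zlib.compress("".join(sorted(records)).encode(), 9))

print(size_shuffled, size_sorted)  # sorted input compresses much smaller
```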
From: Bill Baird <[EMAIL PROTECTED]>
Reply-To: "[EMAIL PROTECTED]" <[EMAIL PROTECTED]>
Date: Thursday, August 22, 2013 7:47 AM
To: "[EMAIL PROTECTED]" <[EMAIL PROTECTED]>
Subject: Re: Avro file Compression
As with any compression, how much you get depends on the size and nature of
the data. I have objects that take 4 or 5 KB unserialized and serialize to
1.5 to 3 KB, or about 2 to 1. However, for the same object structure (which
contains several nested arrays ... lots of strings, numbers ... basic
business data), an instance that is 17 MB uncompressed deflates to 1 MB (or
17 to 1). For very small objects, deflate will actually produce a larger
output, but it does quite well as the size of the data being deflated grows.
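Bill's last point is easy to check with raw deflate (zlib here; the payloads are made up for the demo). A tiny input pays the stream overhead and comes out larger, while a large, repetitive input shrinks dramatically:

```python
import zlib

# A very small record: the compressed stream's header and checksum
# outweigh any savings, so the output is larger than the input.
tiny = b'{"id": 1}'
print(len(tiny), len(zlib.compress(tiny)))

# The same record shape repeated many times deflates very well.
big = b'{"id": 1, "name": "user1", "active": true}' * 10000
print(len(big), len(zlib.compress(big)))
```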
On Wed, Aug 21, 2013 at 11:31 PM, Harsh J <[EMAIL PROTECTED]> wrote:
> Can you share your test? There is an example at
> which has the right calls for using a file writer with a deflate codec
> - is yours similar?
> On Mon, Aug 19, 2013 at 9:42 PM, amit nanda <[EMAIL PROTECTED]> wrote:
>> > I am trying to compress the Avro files that I am writing. For that I am
>> > using the latest Avro C with the "deflate" option, but I am not able to
>> > see any difference in the file size.
>> > Is there any special type of data that this works on, or is there any
>> > more configuration that needs to be done for this to work?
> Harsh J