Snappy compression with pig (Pig user mailing list)


Mohit Anchlia 2012-04-26, 19:32
Mohit Anchlia 2012-04-26, 19:40

Re: Snappy compression with pig
Have you tried setting output compression to Snappy for Store?

grunt> set output.compression.enabled true;
grunt> set output.compression.codec org.apache.hadoop.io.compress.SnappyCodec;

You should be able to read and write Snappy-compressed files with
PigStorage, which uses Hadoop's TextInputFormat internally.
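
A minimal end-to-end sketch of that approach from the grunt shell; the input path /data/events, the output path /data/events_snappy, and the (id, msg) schema are made up for illustration:

grunt> set output.compression.enabled true;
grunt> set output.compression.codec org.apache.hadoop.io.compress.SnappyCodec;

-- STORE should now write Snappy-compressed part files (typically *.snappy),
-- provided the Snappy native libraries are available on the cluster nodes.
grunt> events = LOAD '/data/events' USING PigStorage('\t') AS (id:long, msg:chararray);
grunt> STORE events INTO '/data/events_snappy' USING PigStorage('\t');

-- Reading the compressed output back needs no extra options; TextInputFormat
-- selects the codec from the file extension.
grunt> back = LOAD '/data/events_snappy' USING PigStorage('\t') AS (id:long, msg:chararray);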

Thanks,
Prashant
On Thu, Apr 26, 2012 at 12:40 PM, Mohit Anchlia <[EMAIL PROTECTED]> wrote:

> I think I need to write both store and load functions. It appears that only
> intermediate output that is stored in a temp location can be compressed
> using:
>
> SET mapred.compress.map.output true;
>
> SET mapred.output.compression org.apache.hadoop.io.compress.SnappyCodec;
>
>
>
> Any pointers as to how I can store and load using snappy would be helpful.
> On Thu, Apr 26, 2012 at 12:32 PM, Mohit Anchlia <[EMAIL PROTECTED]> wrote:
>
> > I am able to write with Snappy compression. But I don't think Pig
> > provides anything to read such records. Can someone suggest or point me to
> > relevant code that might help me write a LoadFunc for it?
>
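
For the intermediate (map output) compression mentioned in the quoted message, a sketch assuming the Hadoop 1.x property names; the codec for map output is set with mapred.map.output.compression.codec, and these settings affect only the data shuffled between map and reduce tasks, not the files written by STORE:

grunt> set mapred.compress.map.output true;
grunt> set mapred.map.output.compression.codec org.apache.hadoop.io.compress.SnappyCodec;
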
Mohit Anchlia 2012-04-29, 20:06
Prashant Kommireddi 2012-04-29, 20:12
Mohit Anchlia 2012-04-29, 20:41
Prashant Kommireddi 2012-04-30, 02:33
Mohit Anchlia 2012-04-30, 23:15
Prashant Kommireddi 2012-05-01, 00:38