Re: Snappy compression question
Rural:
If you're busy, J-M or I can open the JIRA.

Having snappy support properly documented would be helpful to hadoop and
hbase users.

Cheers
On Sat, Jan 4, 2014 at 4:23 AM, Jean-Marc Spaggiari <[EMAIL PROTECTED]> wrote:

> Hi Rural,
>
> If you have any recommendation on the way to complete it, it's totally
> welcome. Please open a JIRA to document what should be done, and take a
> look here: http://hbase.apache.org/book/submitting.patches.html to know how
> to provide a patch to update the documentation accordingly.
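
A minimal sketch of that patch workflow, with a hypothetical issue number (the submitting.patches page above is the authoritative reference for the exact diff format the project prefers):

    # From a checkout of the hbase source tree, after editing the docs:
    git diff --no-prefix > HBASE-XXXX.patch
    # Attach HBASE-XXXX.patch to the JIRA issue you opened.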
>
> Thanks,
>
> JM
>
>
> 2014/1/4 Rural Hunter <[EMAIL PROTECTED]>
>
> > The document is far from complete. It doesn't mention that the default
> > hadoop binary package is compiled without snappy support and that you
> > need to compile it with the snappy option yourself. Actually, it doesn't
> > work with any native libs on a 64-bit OS, since the libhadoop.so in the
> > binary package is built for 32-bit OS only. It also doesn't mention that
> > you actually need both snappy and hadoop-snappy.
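
A rough sketch of that rebuild on a 64-bit Linux host, assuming the snappy library and its headers are already installed (the profile and -Drequire.snappy flag come from Hadoop's BUILDING.txt; the paths below are illustrative and vary by version):

    # Build Hadoop's native code with snappy support, from the Hadoop source tree.
    # -Drequire.snappy makes the build fail loudly if the snappy headers are missing.
    mvn clean package -Pdist,native -DskipTests -Dtar -Drequire.snappy

    # Replace the 32-bit libhadoop.so shipped in the binary tarball with the
    # freshly built 64-bit native libraries (illustrative paths).
    cp hadoop-dist/target/hadoop-2.2.0/lib/native/* $HADOOP_HOME/lib/native/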
> >
> > On 2014/1/3 19:20, 张玉雪 wrote:
> >
> >> Hi:
> >>
> >> I am trying to use snappy compression with hadoop 2.2.0 and hbase 0.96.1.1.
> >>
> >> I followed the topic http://hbase.apache.org/book/snappy.compression.html,
> >> but I get an error. Can someone help me?
> >>
> >>
> >> [hadoop@master bin]$ hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test222.txt snappy
> >>
> >> 2014-01-03 19:12:41,971 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >> SLF4J: Class path contains multiple SLF4J bindings.
> >> SLF4J: Found binding in [jar:file:/home/hadoop/hbase-0.96.1.1-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >> SLF4J: Found binding in [jar:file:/home/hadoop/hadoop-2.2.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> >> 2014-01-03 19:12:42,663 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> >> 2014-01-03 19:12:42,670 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> >>
> >> Exception in thread "main" java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
> >>         at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
> >>         at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:131)
> >>         at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:147)
> >>         at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:162)
> >>         at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:312)
> >>         at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:79)
> >>         at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:719)
> >>         at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:131)
> >>         at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:122)
> >>         at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:105)
> >>         at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:426)
> >>         at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:115)
> >>         at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:145)
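
The RuntimeException above is thrown from SnappyCodec.checkNativeCodeLoaded, so it is libhadoop.so itself, not HBase, that lacks snappy support. A quick way to inspect what a given libhadoop.so was built with, before touching HBase at all (recent Hadoop releases also ship a `hadoop checknative -a` command, though it may not exist in 2.2.0):

    # Look for the snappy JNI entry points inside libhadoop.so; if the library was
    # built without snappy support, these symbols are simply absent.
    nm -D $HADOOP_HOME/lib/native/libhadoop.so | grep -i snappy

    # Also confirm the library matches your OS architecture (the stock tarball's
    # copy may be 32-bit on a 64-bit machine).
    file $HADOOP_HOME/lib/native/libhadoop.so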
> >>
> >>
> >>
> >> I have  installed snappy 1.2.0 successfully and hadoop-snappy
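
Once a snappy-enabled, 64-bit libhadoop.so is in place, HBase still has to be able to find it. A common approach from the 0.96-era setup (treat the variable and paths as a sketch for your own layout) is to point HBase's library path at Hadoop's native directory and re-run the smoke test:

    # In hbase-env.sh (or exported before starting HBase): let HBase pick up
    # Hadoop's native libraries, including the snappy bindings.
    export HBASE_LIBRARY_PATH=$HADOOP_HOME/lib/native

    # Re-run the smoke test from the thread; it should now pass.
    hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test222.txt snappy

    # Then a table can actually use snappy, e.g. from the hbase shell:
    #   create 'testtable', {NAME => 'cf', COMPRESSION => 'SNAPPY'}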