HBase >> mail # user >> How to install Snappy?


Jean-Marc Spaggiari 2012-12-01, 23:52
Jean-Marc Spaggiari 2012-12-01, 23:57
Jean-Marc Spaggiari 2012-12-02, 13:25
Re: How to install Snappy?
Hope it helps. This is what I do on Apache Hadoop 1.0.x and HBase 0.92.y:
In hbase-site.xml, add:

<property>
  <name>hbase.regionserver.codecs</name>
  <value>snappy</value>
</property>

Copy that file into the Hadoop conf directory.
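
A minimal sketch of that copy step, assuming HBase lives in /opt/hbase and Hadoop in /opt/hadoop (placeholder paths, not from the original mail):

# Placeholder install locations; adjust to your layout.
HBASE_CONF=/opt/hbase/conf
HADOOP_CONF=/opt/hadoop/conf

# Copy the HBase config (with the snappy codec entry) into the Hadoop conf directory.
cp "$HBASE_CONF/hbase-site.xml" "$HADOOP_CONF/"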

In hbase-env.sh:
export HBASE_LIBRARY_PATH=/pathtoyourhadoop/lib/native/Linux-amd64-64

(In hbase-env.sh I also set HBASE_HOME, HBASE_CONF_DIR, HADOOP_HOME,
and HADOOP_CONF_DIR, but I don't know whether they contribute to making
Snappy work...)
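
For reference, a sketch of how those exports might look in hbase-env.sh; only the variable names come from the mail, the /opt paths are assumptions:

# All paths below are assumptions; adjust to your installation.
export HBASE_HOME=/opt/hbase
export HBASE_CONF_DIR=$HBASE_HOME/conf
export HADOOP_HOME=/opt/hadoop
export HADOOP_CONF_DIR=$HADOOP_HOME/conf
export HBASE_LIBRARY_PATH=$HADOOP_HOME/lib/native/Linux-amd64-64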

In /pathtoyourhadoop/lib/native/Linux-amd64-64 I have:
libsnappy.a
libsnappy.so
libsnappy.so.1
libsnappy.so.1.1.2

good luck
giovanni
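
A quick way to check the result is HBase's CompressionTest, the same smoke test Jean-Marc runs in the quoted message below; the library path here is a placeholder:

# Point HBase at the directory that holds libsnappy.so (placeholder path).
export HBASE_LIBRARY_PATH=/pathtoyourhadoop/lib/native/Linux-amd64-64

# Run from the HBase install directory; this fails with "native snappy
# library not available" if the native library cannot be loaded.
bin/hbase org.apache.hadoop.hbase.util.CompressionTest /tmp/test.txt snappy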
On 12/02/2012 02:25 PM, Jean-Marc Spaggiari wrote:
> So, I spent a few hours on that yesterday with no luck.
>
> Here is what I did:
> - Installed the Google tarball: untarred it, then ran configure, make,
> and make install.
> - Copied the .so files all over my filesystem: into the OS lib dir,
> HBase/lib/native and its subdirs, and Hadoop/lib/native and its subdirs.
> - Installed all the Debian packages with snappy in the name:
> python-snappy, libsnappy-dev, libsnappy1, libsnappy-java
>
> But I still get exactly the same issue as above, and I don't have any
> clue where to dig. There is nothing on the internet about it.
>
> Has anyone faced this already while installing Snappy?
>
> JM
>
> 2012/12/1, Jean-Marc Spaggiari <[EMAIL PROTECTED]>:
>> Sorry, I forgot to paste a few possibly useful lines. I have the lib
>> copied properly into /usr/local/lib, and I have HBASE_LIBRARY_PATH set
>> correctly. Do I need to restart HBase to run this test?
>>
>> hbase@node3:~/hbase-0.94.2$ export HBASE_LIBRARY_PATH=/usr/local/lib/
>> hbase@node3:~/hbase-0.94.2$ bin/hbase
>> org.apache.hadoop.hbase.util.CompressionTest /tmp/test.txt snappy
>> 12/12/01 18:55:29 INFO util.ChecksumType:
>> org.apache.hadoop.util.PureJavaCrc32 not available.
>> 12/12/01 18:55:29 INFO util.ChecksumType: Checksum can use
>> java.util.zip.CRC32
>> 12/12/01 18:55:29 INFO util.ChecksumType:
>> org.apache.hadoop.util.PureJavaCrc32C not available.
>> 12/12/01 18:55:29 DEBUG util.FSUtils: Creating file:/tmp/test.txtwith
>> permission:rwxrwxrwx
>> 12/12/01 18:55:29 WARN util.NativeCodeLoader: Unable to load
>> native-hadoop library for your platform... using builtin-java classes
>> where applicable
>> 12/12/01 18:55:29 WARN metrics.SchemaConfigured: Could not determine
>> table and column family of the HFile path /tmp/test.txt. Expecting at
>> least 5 path components.
>> 12/12/01 18:55:29 WARN snappy.LoadSnappy: Snappy native library is
>> available
>> 12/12/01 18:55:29 WARN snappy.LoadSnappy: Snappy native library not loaded
>> Exception in thread "main" java.lang.RuntimeException: native snappy
>> library not available
>> at
>> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:123)
>> at
>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:100)
>> at
>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:112)
>> at
>> org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:264)
>> at
>> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:739)
>> at
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:127)
>> at
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:118)
>> at
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:101)
>> at
>> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:394)
>> at
>> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:108)
>> at
>> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:138)
>> hbase@node3:~/hbase-0.94.2$ ll /usr/local/lib/
>> total 572
>> -rw-r--r-- 1 root staff 391614 déc  1 18:33 libsnappy.a
>> -rwxr-xr-x 1 root staff    957 déc  1 18:33 libsnappy.la
>> lrwxrwxrwx 1 root staff     18 déc  1 18:33 libsnappy.so ->
>> libsnappy.so.1.1.3
>> lrwxrwxrwx 1 root staff     18 déc  1 18:33 libsnappy.so.1 ->
Jean-Marc Spaggiari 2012-12-03, 13:24
Jean-Marc Spaggiari 2012-12-03, 13:47
Kevin Odell 2012-12-03, 14:09
Kevin Odell 2012-12-03, 14:37
Jean-Marc Spaggiari 2012-12-03, 14:56
Jean-Marc Spaggiari 2012-12-03, 15:15
Kevin Odell 2012-12-03, 15:19
Jean-Marc Spaggiari 2012-12-03, 15:50
surfer 2012-12-04, 06:29
Jean-Marc Spaggiari 2012-12-04, 12:27
ac@...) 2012-12-03, 14:22
ac@...) 2012-12-03, 14:29
Stack 2012-12-03, 18:20
Jean-Marc Spaggiari 2012-12-03, 18:55
Jean-Marc Spaggiari 2012-12-03, 19:48
Stack 2012-12-03, 20:15
ac@...) 2012-12-02, 06:16
Håvard Wahl Kongsgård 2012-12-02, 23:52
Arati Patro 2012-12-03, 05:41
Mohamed Ibrahim 2012-12-03, 02:10