HBase, mail # user - How to install Snappy?


Re: How to install Snappy?
Mohamed Ibrahim 2012-12-03, 02:10
I got stuck like you before when trying to set up Snappy on HBase. Here is
what I recall doing, from memory:

I think you need to copy the Snappy libs into the Hadoop lib/native
directory, then point HBase at it.

So in hbase/conf/hbase-env.sh, you should have something similar to this:
export HADOOP_HOME=<path-to-hadoop>/hadoop/
export HBASE_LIBRARY_PATH=<path-to-hadoop>/hadoop/lib/
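
With those two variables set, you can check whether HBase actually finds Snappy by running HBase's CompressionTest utility (the same command that is failing in your log). The paths below are examples only; run it from your HBase install directory:

```shell
# Example paths -- substitute your real Hadoop/HBase locations.
export HADOOP_HOME=${HADOOP_HOME:-/usr/local/hadoop}
export HBASE_LIBRARY_PATH=${HBASE_LIBRARY_PATH:-$HADOOP_HOME/lib/}
# Guarded so it only runs the test when invoked from an HBase install dir.
if [ -x bin/hbase ]; then
    bin/hbase org.apache.hadoop.hbase.util.CompressionTest /tmp/test.txt snappy
else
    echo "bin/hbase not found - run this from the HBase install directory"
fi
```

If Snappy is wired up correctly, the test should report success instead of the RuntimeException you are seeing.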

Inside the Hadoop folder
"<path-to-hadoop>/hadoop/lib/native/Linux-amd64-64/", this is what I have
now:

libhadoop.a
libhadoop.la
libhadoop.so
libhadoop.so.1
libhadoop.so.1.0.0
libhadooppipes.a
libhadooputils.a
libhdfs.a
libhdfs.la
libhdfs.so
libhdfs.so.0
libhdfs.so.0.0.0
libsnappy.so
libsnappy.so.1
libsnappy.so.1.1.3

and I think I had to copy those files into this folder from somewhere else
inside Hadoop, along with the Snappy libs, for Snappy to work.
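
The copy step can be sketched as below. This is an illustrative version that uses a scratch directory with empty stand-in files so it is safe to run anywhere; on a real box you would copy the actual built libs (e.g. from /usr/local/lib) into your real <path-to-hadoop>/hadoop/lib/native/Linux-amd64-64/:

```shell
# Scratch layout standing in for the real directories.
SCRATCH=$(mktemp -d)
NATIVE="$SCRATCH/hadoop/lib/native/Linux-amd64-64"
SRC="$SCRATCH/usr-local-lib"   # stand-in for wherever `make install` put the libs
mkdir -p "$NATIVE" "$SRC"
# Empty stand-ins for the built Snappy libs.
touch "$SRC/libsnappy.so" "$SRC/libsnappy.so.1" "$SRC/libsnappy.so.1.1.3"
# The actual step: copy every libsnappy* next to the Hadoop native libs.
cp "$SRC"/libsnappy.so* "$NATIVE/"
ls "$NATIVE"
```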

By the way, the Snappy Debian packages did not work for me. I had to
download the Snappy source from
http://code.google.com/p/snappy/downloads/list , build it with make, and
copy the resulting libs into Hadoop.
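
The build itself is the standard autotools sequence. A guarded sketch (the version number below is a placeholder; use whatever is current on the downloads page, and the script only builds if you have already fetched the tarball):

```shell
# Placeholder version -- use the latest tarball from the downloads page.
TARBALL=snappy-1.0.5.tar.gz
if [ -f "$TARBALL" ]; then
    tar xzf "$TARBALL"
    cd "${TARBALL%.tar.gz}"
    ./configure && make
    # Installs libsnappy.so* under /usr/local/lib by default.
    sudo make install
else
    echo "Fetch $TARBALL from the snappy downloads page first"
fi
```

Afterwards, copy the resulting libsnappy.so* files into the Hadoop native folder.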

I'm using the Hadoop and HBase tarballs from Apache's website.

Let me know if you're still stuck. I will try to redo what I did before in
a local version.

Best,
Mohamed
On Sun, Dec 2, 2012 at 6:52 PM, Håvard Wahl Kongsgård <[EMAIL PROTECTED]> wrote:

> Are you using Cloudera CDH3? If so, you only need to install
> hadoop-0.20-native.
>
> On Sun, Dec 2, 2012 at 12:57 AM, Jean-Marc Spaggiari
> <[EMAIL PROTECTED]> wrote:
> > Sorry, I forgot to paste a few maybe-useful lines. I have the lib in
> > /usr/local/lib copied properly, and I have HBASE_LIBRARY_PATH set
> > correctly. Do I need to restart HBase to run this test?
> >
> > hbase@node3:~/hbase-0.94.2$ export HBASE_LIBRARY_PATH=/usr/local/lib/
> > hbase@node3:~/hbase-0.94.2$ bin/hbase
> > org.apache.hadoop.hbase.util.CompressionTest /tmp/test.txt snappy
> > 12/12/01 18:55:29 INFO util.ChecksumType: org.apache.hadoop.util.PureJavaCrc32 not available.
> > 12/12/01 18:55:29 INFO util.ChecksumType: Checksum can use java.util.zip.CRC32
> > 12/12/01 18:55:29 INFO util.ChecksumType: org.apache.hadoop.util.PureJavaCrc32C not available.
> > 12/12/01 18:55:29 DEBUG util.FSUtils: Creating file:/tmp/test.txtwith permission:rwxrwxrwx
> > 12/12/01 18:55:29 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> > 12/12/01 18:55:29 WARN metrics.SchemaConfigured: Could not determine table and column family of the HFile path /tmp/test.txt. Expecting at least 5 path components.
> > 12/12/01 18:55:29 WARN snappy.LoadSnappy: Snappy native library is available
> > 12/12/01 18:55:29 WARN snappy.LoadSnappy: Snappy native library not loaded
> > Exception in thread "main" java.lang.RuntimeException: native snappy library not available
> >         at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:123)
> >         at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:100)
> >         at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:112)
> >         at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:264)
> >         at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:739)
> >         at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:127)
> >         at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:118)
> >         at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:101)
> >         at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:394)
> >         at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:108)
> >         at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:138)
> > hbase@node3:~/hbase-0.94.2$ ll /usr/local/lib/