Snappy compression not working with HBase 0.98.3
Hi All,

I recently upgraded my HBase environment from 0.94 to 0.98.3 and am now
trying to use Snappy compression with it.

I installed the Snappy library following the guide at
https://hbase.apache.org/book/snappy.compression.html
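
For reference, I understand the same check that fails in the stack trace
below can be run standalone with HBase's bundled CompressionTest utility
(the local file path here is just an example):

    hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy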

When I create a table with Snappy compression enabled, I get the error
below:
hbase(main):001:0> create 'test', {NAME=>'cf1', COMPRESSION=>'SNAPPY'}
2014-07-08 20:06:33,265 WARN  [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

ERROR: java.io.IOException: Compression algorithm 'snappy' previously failed test.
    at org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:85)
    at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1774)
    at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1767)
    at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1749)
    at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1784)
    at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40470)
    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2012)
    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
    at org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:73)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
    at java.util.concurrent.FutureTask.run(FutureTask.java:166)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:722)
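
I also notice the NativeCodeLoader warning above about the native-hadoop
library not loading. If it helps with diagnosis, on Hadoop 2.x I believe
the native codecs the runtime can actually load can be listed with:

    hadoop checknative -a

and, if Snappy shows up as false there, I assume the native library
directory would need to be made visible to HBase, e.g. something like
(the path is just a guess for a typical install):

    export HBASE_LIBRARY_PATH=/usr/lib/hadoop/lib/native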
Please let me know if anyone is aware of this issue.

Thanks & Regards,
Hanish Bansal