HBase user mailing list: Compression class loading mismatch in 0.94.2

Re: Compression class loading mismatch in 0.94.2
The change happened in 0.94.5.

Please see HBASE-5458.

Cheers
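As background on why the two lookups behave differently: in a plain `java` launch the system class loader and the thread's context class loader are typically the same AppClassLoader, but frameworks that add jars at runtime (MapReduce tasks being a common case) often install a child loader as the context loader, and classes visible there are invisible to the system loader. A minimal sketch of the comparison (plain Java, not HBase code):

```java
// Sketch (not HBase code): compares the system class loader with the
// current thread's context class loader. In a plain `java` launch they
// are usually the very same AppClassLoader instance; containers and
// frameworks often swap in a child loader as the context loader so it
// can see extra jars that the system loader cannot.
public class LoaderCheck {
    public static void main(String[] args) {
        ClassLoader system = ClassLoader.getSystemClassLoader();
        ClassLoader context = Thread.currentThread().getContextClassLoader();
        System.out.println("system  loader: " + system);
        System.out.println("context loader: " + context);
        System.out.println("same loader: " + (system == context));
    }
}
```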

On Sun, Jun 9, 2013 at 5:35 AM, Levy Meny <[EMAIL PROTECTED]> wrote:

> Hi,
> Does anyone know why org.apache.hadoop.hbase.io.hfile.Compression changed
> in 0.94.2 to use the SystemClassLoader to load the Snappy codec class,
> instead of the ContextClassLoader used in previous versions (e.g. 0.92.1)?
>
>       private CompressionCodec buildCodec(Configuration conf) {
>         try {
>           Class<?> externalCodec = ClassLoader.getSystemClassLoader()
>               .loadClass("org.apache.hadoop.io.compress.SnappyCodec");
>           return (CompressionCodec) ReflectionUtils.newInstance(externalCodec, conf);
>
> (By the way, note that the ContextClassLoader is still used for loading
> other codecs, e.g. Lz4Codec.)
>
> The error I got:
>
> 2013-05-31 00:01:25,704 [ERROR] [BulkImportManager-2-thread-1]
> org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles
> (LoadIncrementalHFiles.java:343) - Unexpected execution exception during
> splitting
> java.util.concurrent.ExecutionException: java.lang.RuntimeException:
> java.lang.ClassNotFoundException: org.apache.hadoop.io.compress.SnappyCodec
>         at java.util.concurrent.FutureTask$Sync.innerGet(Unknown Source)
>         at java.util.concurrent.FutureTask.get(Unknown Source)
>         at
> org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.groupOrSplitPhase(LoadIncrementalHFiles.java:333)
>         at
> org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.doBulkLoad(LoadIncrementalHFiles.java:232)
> ...
> Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException:
> org.apache.hadoop.io.compress.SnappyCodec
>         at
> org.apache.hadoop.hbase.io.hfile.Compression$Algorithm$4.buildCodec(Compression.java:207)
>         at
> org.apache.hadoop.hbase.io.hfile.Compression$Algorithm$4.getCodec(Compression.java:192)
>         at
> org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:302)
>         at
> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:745)
>         at
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:134)
>         at
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:125)
>         at
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:105)
>         at
> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:394)
>         at
> org.apache.hadoop.hbase.regionserver.StoreFile$Writer.<init>(StoreFile.java:1003)
>         at
> org.apache.hadoop.hbase.regionserver.StoreFile$Writer.<init>(StoreFile.java:948)
>         at
> org.apache.hadoop.hbase.regionserver.StoreFile$WriterBuilder.build(StoreFile.java:851)
>         at
> org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.copyHFileHalf(LoadIncrementalHFiles.java:541)
>         at
> org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.splitStoreFile(LoadIncrementalHFiles.java:514)
>         at
> org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.splitStoreFile(LoadIncrementalHFiles.java:375)
>         at
> org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.groupOrSplit(LoadIncrementalHFiles.java:439)
>         at
> org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles$2.call(LoadIncrementalHFiles.java:323)
>         at
> org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles$2.call(LoadIncrementalHFiles.java:321)
>         at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
>         at java.util.concurrent.FutureTask.run(Unknown Source)
>         ... 3 more
> Caused by: java.lang.ClassNotFoundException:
> org.apache.hadoop.io.compress.SnappyCodec
>         at java.net.URLClassLoader$1.run(Unknown Source)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(Unknown Source)
>         at java.lang.ClassLoader.loadClass(Unknown Source)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
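The failure mode in that trace can be reproduced in miniature with a class loader that cannot see the application classpath, analogous to the system loader not seeing a codec jar that only the context loader knows about. A hedged sketch (plain Java, not HBase or Hadoop code):

```java
import java.net.URL;
import java.net.URLClassLoader;

// Sketch: a URLClassLoader with a null parent delegates only to the
// bootstrap loader, so it cannot resolve a class that is perfectly
// loadable from the application classpath. The same kind of visibility
// gap produces the ClassNotFoundException for SnappyCodec when the
// codec jar is visible only to the context class loader.
public class VisibilityDemo {
    public static void main(String[] args) throws Exception {
        // Loadable through the normal application class loader:
        Class<?> ok = Class.forName("VisibilityDemo");
        System.out.println("app loader sees: " + ok.getName());

        // An isolated loader: empty URL list, bootstrap-only parent.
        try (URLClassLoader isolated = new URLClassLoader(new URL[0], null)) {
            try {
                isolated.loadClass("VisibilityDemo");
                System.out.println("unexpectedly loaded");
            } catch (ClassNotFoundException e) {
                System.out.println("ClassNotFoundException: " + e.getMessage());
            }
        }
    }
}
```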