HBase >> mail # user >> Guava 15


That would more or less mean backporting the patch to 0.94, no?
It should work, IMHO.
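For what it's worth, the class that Guava 15 removed is trivial, so one lightweight alternative to backporting is to compile a shim with the same fully-qualified name onto the test classpath. This is a sketch of that idea, not a tested fix for 0.94; the `main` method is only a smoke test and is not part of the shim itself:

```java
// Hypothetical shim for com.google.common.io.NullOutputStream, which existed
// through Guava 14.x and was removed in Guava 15. HBase 0.94's
// HFileWriterV2.close only needs an OutputStream that discards bytes, so
// re-adding the class under its old name lets it resolve again at runtime.
package com.google.common.io;

import java.io.IOException;
import java.io.OutputStream;

public final class NullOutputStream extends OutputStream {
  // Every write is silently discarded, matching the old Guava behavior.
  @Override public void write(int b) { }

  @Override public void write(byte[] b, int off, int len) { }

  // Smoke test only: writes must complete without throwing.
  public static void main(String[] args) throws IOException {
    OutputStream out = new NullOutputStream();
    out.write(42);
    out.write(new byte[] {1, 2, 3}, 0, 3);
    System.out.println("NullOutputStream shim ok");
  }
}
```

Whether this is safe for more than a test environment depends on nothing else in the deployment relying on Guava-internal behavior of the removed class, so treat it strictly as a test-classpath workaround.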
On Mon, Dec 16, 2013 at 3:16 PM, Kristoffer Sjögren <[EMAIL PROTECTED]> wrote:

> Thanks! But we can't really upgrade to HBase 0.96 right now, and we need to
> move to Guava 15 :-(
>
> I was thinking of overriding the classes fixed in the patch in our test
> environment.
>
> Could this work maybe?
>
>
> On Mon, Dec 16, 2013 at 11:01 AM, Kristoffer Sjögren <[EMAIL PROTECTED]> wrote:
>
> > Hi
> >
> > At the moment HFileWriterV2.close breaks at startup when using Guava 15.
> > This is not a client problem - it happens because we start a master node
> > to do integration tests.
> >
> > A bit precarious, and I wonder if there are any plans to support Guava 15,
> > or if there is a clever way around this?
> >
> > Cheers,
> > -Kristoffer
> >
> > org.apache.hadoop.hbase.DroppedSnapshotException: region: -ROOT-,,0
> >       at org.apache.hadoop.hbase.regionserver.HRegion.internalFlushcache(HRegion.java:1646)
> >       at org.apache.hadoop.hbase.regionserver.HRegion.internalFlushcache(HRegion.java:1514)
> >       at org.apache.hadoop.hbase.regionserver.HRegion.doClose(HRegion.java:1032)
> >       at org.apache.hadoop.hbase.regionserver.HRegion.close(HRegion.java:980)
> >       at org.apache.hadoop.hbase.regionserver.HRegion.close(HRegion.java:951)
> >       at org.apache.hadoop.hbase.master.MasterFileSystem.bootstrap(MasterFileSystem.java:523)
> >       at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:463)
> >       at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:148)
> >       at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:133)
> >       at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:549)
> >       at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:408)
> >       at org.apache.hadoop.hbase.master.HMasterCommandLine$LocalHMaster.run(HMasterCommandLine.java:226)
> >       at java.lang.Thread.run(Thread.java:722)
> > Caused by: java.lang.NoClassDefFoundError: com/google/common/io/NullOutputStream
> >       at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.close(HFileWriterV2.java:375)
> >       at org.apache.hadoop.hbase.regionserver.StoreFile$Writer.close(StoreFile.java:1299)
> >       at org.apache.hadoop.hbase.regionserver.Store.internalFlushCache(Store.java:897)
> >       at org.apache.hadoop.hbase.regionserver.Store.flushCache(Store.java:778)
> >       at org.apache.hadoop.hbase.regionserver.Store$StoreFlusherImpl.flushCache(Store.java:2290)
> >       at org.apache.hadoop.hbase.regionserver.HRegion.internalFlushcache(HRegion.java:1621)
> >       ... 12 more
> > Caused by: java.lang.ClassNotFoundException: com.google.common.io.NullOutputStream
> >       at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> >       at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> >       at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
> >       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> >       at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
> >       ... 18 more
> >
> >
>
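As context for the trace above: the root cause is a `ClassNotFoundException` for `com.google.common.io.NullOutputStream`, a class that is present through Guava 14.x but gone in Guava 15. A quick diagnostic like the following (a hypothetical helper, not part of HBase) can confirm which situation a given classpath is in before starting the mini-cluster:

```java
// Hypothetical pre-flight check: report whether the Guava on the classpath
// still provides the class HBase 0.94's HFileWriterV2.close needs.
public class GuavaCheck {
  public static void main(String[] args) {
    try {
      Class.forName("com.google.common.io.NullOutputStream");
      System.out.println("com.google.common.io.NullOutputStream found (Guava <= 14.x)");
    } catch (ClassNotFoundException e) {
      // Guava 15+ (or no Guava at all) on the classpath.
      System.out.println("com.google.common.io.NullOutputStream not found");
    }
  }
}
```

Running this with Guava 15 on the classpath prints the "not found" line, which corresponds exactly to the `NoClassDefFoundError` raised inside `HFileWriterV2.close` in the stack trace.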