Re: Concurrently Reading Still Got Exceptions
This could be a case where the RPC server response got closed and we then try
to act on the nullified object.  I remember some similar issues being fixed in
this part of the code; I don't remember whether the fix went into the 0.92
version.

On Sat, Mar 2, 2013 at 11:14 PM, Anoop John <[EMAIL PROTECTED]> wrote:

> Is this really related to concurrent reads?  I think it is something else.
> I will dig into the code tomorrow.  Can you attach a JUnit test case which
> will reproduce the NPE?
>
> -Anoop-
>
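
A rough sketch of the kind of JUnit test Anoop asks for, assuming the reader
threads share a single HTable instance; the table name below is a placeholder
and the setup should be adapted to the actual HTablePool-based code:

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.ResultScanner;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.util.Bytes;
    import org.junit.Test;

    public class ConcurrentScanTest {

        @Test
        public void concurrentScans() throws Exception {
            Configuration conf = HBaseConfiguration.create();
            // "rank_table" is a placeholder; point this at an existing table.
            final HTable table = new HTable(conf, "rank_table");

            ExecutorService pool = Executors.newFixedThreadPool(10);
            for (int i = 0; i < 10; i++) {
                pool.submit(new Runnable() {
                    public void run() {
                        try {
                            // All threads scan the same HTable instance concurrently.
                            ResultScanner scanner = table.getScanner(new Scan());
                            for (Result r : scanner) {
                                Bytes.toString(r.getRow());
                            }
                            scanner.close();
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                    }
                });
            }
            pool.shutdown();
            pool.awaitTermination(5, TimeUnit.MINUTES);
            table.close();
        }
    }
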
>
> On Sat, Mar 2, 2013 at 9:29 PM, Ted Yu <[EMAIL PROTECTED]> wrote:
>
> > Looks like the issue might be related to HTable:
> >
> >           at org.apache.hadoop.hbase.client.HTable$ClientScanner.nextScanner(HTable.java:1167)
> >           at org.apache.hadoop.hbase.client.HTable$ClientScanner.next(HTable.java:1296)
> >           at org.apache.hadoop.hbase.client.HTable$ClientScanner$1.hasNext(HTable.java:1356)
> >
> > In newer versions of HBase (0.94), you can pass an executor to the HTable
> > constructor so that you don't need to use HTablePool:
> >
> >   public HTable(Configuration conf, final byte[] tableName,
> >       final ExecutorService pool)
> >
> > Cheers
> >
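
A minimal sketch of the shared-executor approach described above, assuming
HBase 0.94 or later; the table name and pool size here are placeholders:

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.ResultScanner;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.util.Bytes;

    public class SharedExecutorExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            // One executor shared by every HTable the application creates,
            // instead of pooling HTable instances via HTablePool.
            ExecutorService pool = Executors.newFixedThreadPool(10);

            // 0.94 constructor: HTable(Configuration, byte[], ExecutorService).
            HTable table = new HTable(conf, Bytes.toBytes("rank_table"), pool);
            try {
                ResultScanner scanner = table.getScanner(new Scan());
                for (Result r : scanner) {
                    System.out.println(Bytes.toString(r.getRow()));
                }
                scanner.close();
            } finally {
                table.close();
                pool.shutdown();
            }
        }
    }
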
> > On Wed, Feb 6, 2013 at 2:27 AM, Bing Li <[EMAIL PROTECTED]> wrote:
> >
> > > Dear all,
> > >
> > > Some exceptions are raised when I read data from HBase concurrently.
> > > The version of HBase I am using is 0.92.0.
> > >
> > > I cannot fix the problem. Could you please help me?
> > >
> > > Thanks so much!
> > >
> > > Best wishes,
> > > Bing
> > >
> > >       Feb 6, 2013 12:21:31 AM org.apache.hadoop.hbase.ipc.HBaseClient$Connection run
> > >       WARNING: Unexpected exception receiving call responses
> > >       java.lang.NullPointerException
> > >           at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:521)
> > >           at org.apache.hadoop.hbase.io.HbaseObjectWritable.readFields(HbaseObjectWritable.java:297)
> > >           at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:593)
> > >           at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:505)
> > >       Feb 6, 2013 12:21:31 AM org.apache.hadoop.hbase.client.ScannerCallable close
> > >       WARNING: Ignore, probably already closed
> > >       java.io.IOException: Call to greatfreeweb/127.0.1.1:60020 failed on local exception: java.io.IOException: Unexpected exception receiving call responses
> > >           at org.apache.hadoop.hbase.ipc.HBaseClient.wrapException(HBaseClient.java:934)
> > >           at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:903)
> > >           at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:150)
> > >           at $Proxy6.close(Unknown Source)
> > >           at org.apache.hadoop.hbase.client.ScannerCallable.close(ScannerCallable.java:112)
> > >           at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:74)
> > >           at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:39)
> > >           at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getRegionServerWithRetries(HConnectionManager.java:1325)
> > >           at org.apache.hadoop.hbase.client.HTable$ClientScanner.nextScanner(HTable.java:1167)
> > >           at org.apache.hadoop.hbase.client.HTable$ClientScanner.next(HTable.java:1296)
> > >           at org.apache.hadoop.hbase.client.HTable$ClientScanner$1.hasNext(HTable.java:1356)
> > >           at com.greatfree.hbase.rank.NodeRankRetriever.loadNodeGroupNodeRankRowKeys(NodeRankRetriever.java:348)
> > >           at com.greatfree.ranking.PersistNodeGroupNodeRanksThread.run(PersistNodeGroupNodeRanksThread.java:29)
> > >           at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)