HBase >> mail # user >> Custom HBase Filter : Error in readFields


Re: Custom HBase Filter : Error in readFields
Here[1] is the code for the filter.

-Bryan

[1] - http://pastebin.com/5Qjas88z

> Bryan:
> It looks like you may have missed adding a unit test for your filter.
>
> A unit test would have caught this situation much earlier.
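A round-trip unit test of the sort suggested here can be sketched as follows. The stand-in class below is hypothetical (the actual RowRangeFilter is in the pastebin above and would implement HBase's Filter/Writable contract); the point is only the shape of the test: write the filter to a byte stream, read it back into a fresh instance, and compare the fields.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.util.Arrays;

// Hypothetical stand-in for a custom filter's serialization contract.
// The real class would implement org.apache.hadoop.hbase.filter.Filter
// and override write(DataOutput)/readFields(DataInput).
class RowRangeStub {
    byte[] startRow = new byte[0];
    byte[] stopRow = new byte[0];

    void write(DataOutputStream out) throws IOException {
        out.writeInt(startRow.length);   // length prefix first...
        out.write(startRow);             // ...then the bytes
        out.writeInt(stopRow.length);
        out.write(stopRow);
    }

    void readFields(DataInputStream in) throws IOException {
        startRow = new byte[in.readInt()];  // read exactly what write() declared
        in.readFully(startRow);
        stopRow = new byte[in.readInt()];
        in.readFully(stopRow);
    }
}

public class RowRangeRoundTripTest {
    public static void main(String[] args) throws IOException {
        RowRangeStub original = new RowRangeStub();
        original.startRow = "row-000".getBytes();
        original.stopRow = "row-999".getBytes();

        // Serialize exactly as the client would put it on the wire.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        original.write(new DataOutputStream(buf));

        // Deserialize into a fresh instance, as each region server does.
        RowRangeStub copy = new RowRangeStub();
        copy.readFields(new DataInputStream(
                new ByteArrayInputStream(buf.toByteArray())));

        if (!Arrays.equals(original.startRow, copy.startRow)
                || !Arrays.equals(original.stopRow, copy.stopRow)) {
            throw new AssertionError("round trip lost data");
        }
        System.out.println("round trip ok");
    }
}
```

If readFields reads a different number of bytes than write produced, this test fails locally, without needing a multi-region cluster to trigger it.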
>
> Cheers
>
> On Wed, Feb 20, 2013 at 3:42 PM, Viral Bajaria <[EMAIL PROTECTED]> wrote:
>
> > Also, readFields is your implementation of how to read the byte array
> > transferred from the client. So I think there must be some mismatch between
> > how you write the byte array to the network and what you read back out of
> > it, i.e. the sizes of the arrays might not be identical.
> >
> > But as Ted mentioned, looking at the code will help troubleshoot it better.
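A minimal sketch of the kind of write/read asymmetry described above (hypothetical field layout, not the actual filter code): here the writer emits a 2-byte length prefix while the reader expects a 4-byte one, so readFully is handed a bogus size and fails partway through the stream, much like the readFully failure in the log below.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class WriteReadMismatchDemo {
    public static void main(String[] args) throws IOException {
        byte[] row = "row-123".getBytes();

        // Writer side: length prefix written as a 2-byte short.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeShort(row.length);
        out.write(row);

        // Reader side: expects a 4-byte int prefix, so the first two
        // payload bytes get swallowed into the "length".
        DataInputStream in = new DataInputStream(
                new ByteArrayInputStream(buf.toByteArray()));
        try {
            int len = in.readInt();      // misreads 0x0007726F = 488047, not 7
            byte[] copy = new byte[len]; // array sized from the bogus length
            in.readFully(copy);          // stream has only 5 bytes left
            System.out.println("unexpectedly read " + copy.length + " bytes");
        } catch (IOException e) {
            System.out.println("mismatch detected: "
                    + e.getClass().getSimpleName());
        }
    }
}
```

The fix is to make write() and readFields() exact mirrors of each other, so the reader consumes precisely the bytes the writer produced.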
> >
> > On Wed, Feb 20, 2013 at 3:32 PM, Ted Yu <[EMAIL PROTECTED]> wrote:
> >
> > > If you show us the code for RowRangeFilter, that would help us
> > > troubleshoot.
> > >
> > > Cheers
> > >
> > > On Wed, Feb 20, 2013 at 2:05 PM, Bryan Baugher <[EMAIL PROTECTED]> wrote:
> > >
> > > > Hi everyone,
> > > >
> > > > I am trying to write my own custom Filter, but I have been having issues.
> > > > When there is only one region in my table the scan works as expected, but
> > > > when there are more, HBase attempts to create a new instance of my filter
> > > > and deserialize the information again, and the data seems to be gone. I am
> > > > running HBase 0.92.1-cdh4.1.1.
> > > >
> > > > 2013-02-20 15:39:53,220 DEBUG com.cerner.kepler.filters.RowRangeFilter: Reading fields
> > > > 2013-02-20 15:40:08,612 WARN org.apache.hadoop.hbase.util.Sleeper: We slept 15346ms instead of 3000ms, this is likely due to a long garbage collecting pause and it's usually bad, see
> > > > http://hbase.apache.org/book.html#trouble.rs.runtime.zkexpired
> > > > 2013-02-20 15:40:09,142 ERROR org.apache.hadoop.hbase.io.HbaseObjectWritable: Error in readFields
> > > > java.lang.ArrayIndexOutOfBoundsException
> > > >         at java.lang.System.arraycopy(Native Method)
> > > >         at java.io.ByteArrayInputStream.read(ByteArrayInputStream.java:174)
> > > >         at java.io.DataInputStream.readFully(DataInputStream.java:178)
> > > >         at java.io.DataInputStream.readFully(DataInputStream.java:152)
> > > >         at com.cerner.kepler.filters.RowRangeFilter.readFields(RowRangeFilter.java:226)
> > > >         at org.apache.hadoop.hbase.client.Scan.readFields(Scan.java:548)
> > > >         at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:652)
> > > >         at org.apache.hadoop.hbase.ipc.Invocation.readFields(Invocation.java:125)
> > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.processData(HBaseServer.java:1254)
> > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.readAndProcess(HBaseServer.java:1183)
> > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:719)
> > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.doRunLoop(HBaseServer.java:511)
> > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:486)
> > > >         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
> > > >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
> > > >         at java.lang.Thread.run(Thread.java:662)
> > > > 2013-02-20 15:40:17,498 WARN org.apache.hadoop.ipc.HBaseServer: Unable to read call parameters for client ***
> > > > java.io.IOException: Error in readFields
> > > >         at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:655)
> > > >         at org.apache.hadoop.hbase.ipc.Invocation.readFields(Invocation.java:125)
> > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.processData(HBaseServer.java:1254)
> > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.readAndProcess(HBaseServer.java:1183)
> > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:719)
> > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.doRunLoop(HBaseServer.java:511)
> > > >         at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:486)
> > > >         at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
> > > >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
> > > >         at java.io.DataInputStream.readFully(DataInputStream.java:178)
> > > >         at java.io.DataInputStream.readFully(DataInputStream.java:152)
> > > >         at com.cerner.kepler.filters.RowRangeFilter.readFields(RowRangeFilter.java:226)
> > > >         at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:652)