HBase, mail # user - Custom HBase Filter : Error in readFields


Re: Custom HBase Filter : Error in readFields
Bryan Baugher 2013-02-21, 13:06
Logged this issue[1] to have the change made.

[1] - https://issues.apache.org/jira/browse/HBASE-7894
On Thu, Feb 21, 2013 at 12:13 AM, lars hofhansl <[EMAIL PROTECTED]> wrote:

> This is not really obvious. I think we might want to add something to the
> Javadoc to that effect.
>
> Otherwise we'll see what you've seen, namely that it works fine when you
> test with a single region server, but not if the filter is serialized to
> multiple region servers.
>
> -- Lars
>
>
>
> ________________________________
>  From: Bryan Baugher <[EMAIL PROTECTED]>
> To: user <[EMAIL PROTECTED]>; lars hofhansl <[EMAIL PROTECTED]>
> Sent: Wednesday, February 20, 2013 9:09 PM
> Subject: Re: Custom HBase Filter : Error in readFields
>
> Ugh, yes you are correct. This fixed my issue. Thank you all for your help.
>
>
> On Wed, Feb 20, 2013 at 10:54 PM, lars hofhansl <[EMAIL PROTECTED]> wrote:
>
> > You probably want your write() to be idempotent. Currently it will
> > exhaust the iterator and not reset it.
> > (Just guessing, though)
> >
> >
> >
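For reference, here is a minimal sketch of the pattern being suggested, not the actual code from the pastebin; the startRow/stopRow fields are assumed for illustration. write() serializes from stored fields via Bytes.writeByteArray, so it produces the same bytes no matter how many times it is called, and readFields() is its mirror image:

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.hbase.filter.FilterBase;
import org.apache.hadoop.hbase.util.Bytes;

// Sketch only: filtering logic (filterRowKey etc.) is omitted, this class
// just shows the Writable serialization pattern for HBase 0.92-style filters.
public class RowRangeFilter extends FilterBase {
  private byte[] startRow;
  private byte[] stopRow;

  public RowRangeFilter() {
    // Required no-arg constructor: each region server instantiates the
    // filter reflectively and then calls readFields() to populate it.
  }

  public RowRangeFilter(byte[] startRow, byte[] stopRow) {
    this.startRow = startRow;
    this.stopRow = stopRow;
  }

  @Override
  public void write(DataOutput out) throws IOException {
    // Serialize from stored fields, not from a one-shot iterator, so that
    // write() yields the same bytes every time it is invoked.
    Bytes.writeByteArray(out, startRow);
    Bytes.writeByteArray(out, stopRow);
  }

  @Override
  public void readFields(DataInput in) throws IOException {
    // Mirror image of write(): read the arrays back in the same order.
    this.startRow = Bytes.readByteArray(in);
    this.stopRow = Bytes.readByteArray(in);
  }
}
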
> > ________________________________
> >  From: Bryan Baugher <[EMAIL PROTECTED]>
> > To: user <[EMAIL PROTECTED]>
> > Sent: Wednesday, February 20, 2013 7:46 PM
> > Subject: Re: Custom HBase Filter : Error in readFields
> >
> > I updated my code to use the Bytes class for serialization and added more
> > log messages. I see this[1] now. It is able to create the filter the first
> > time but when it gets to the second region (on the same region server) it
> > attempts to create the filter again but the data read in from readFields
> > seems corrupted.
> >
> > [1] - http://pastebin.com/TqNsUVSk
> >
> >
> > On Wed, Feb 20, 2013 at 8:48 PM, Ted Yu <[EMAIL PROTECTED]> wrote:
> >
> > > Can you use code similar to the following for serialization ?
> > >   public void readFields(DataInput in) throws IOException {
> > >     this.prefix = Bytes.readByteArray(in);
> > >   }
> > >
> > > See src/main/java/org/apache/hadoop/hbase/filter/PrefixFilter.java
> > >
> > > Thanks
> > >
> > > On Wed, Feb 20, 2013 at 5:58 PM, Bryan Baugher <[EMAIL PROTECTED]> wrote:
> > >
> > > > Here[1] is the code for the filter.
> > > >
> > > > -Bryan
> > > >
> > > > [1] - http://pastebin.com/5Qjas88z
> > > >
> > > > > Bryan:
> > > > > Looks like you may have missed adding a unit test for your filter.
> > > > >
> > > > > A unit test should have caught this situation much earlier.
> > > > >
> > > > > Cheers
> > > > >
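A unit test along these lines (hypothetical, and assuming the RowRangeFilter sketch above) would have caught a non-idempotent write(): it serializes the same instance twice, which is roughly what happens when the filter is shipped to more than one region, and checks that both payloads match and can be read back:

import static org.junit.Assert.assertArrayEquals;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;

import org.apache.hadoop.hbase.util.Bytes;
import org.junit.Test;

public class RowRangeFilterTest {

  // Helper: run the filter's Writable serialization into a byte array.
  private byte[] serialize(RowRangeFilter filter) throws Exception {
    ByteArrayOutputStream bytes = new ByteArrayOutputStream();
    filter.write(new DataOutputStream(bytes));
    return bytes.toByteArray();
  }

  @Test
  public void writeIsIdempotent() throws Exception {
    RowRangeFilter filter =
        new RowRangeFilter(Bytes.toBytes("aaa"), Bytes.toBytes("zzz"));

    // Serializing twice must yield identical bytes; a write() that drains
    // an iterator would produce a different (e.g. empty) payload the
    // second time around.
    byte[] first = serialize(filter);
    byte[] second = serialize(filter);
    assertArrayEquals(first, second);

    // A fresh instance must be able to read the payload back, just as a
    // region server would after constructing the filter reflectively.
    RowRangeFilter copy = new RowRangeFilter();
    copy.readFields(new DataInputStream(new ByteArrayInputStream(second)));
  }
}
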
> > > > > On Wed, Feb 20, 2013 at 3:42 PM, Viral Bajaria <[EMAIL PROTECTED]> wrote:
> > > > >
> > > > > > Also the readFields is your implementation of how to read the byte
> > > > > > array transferred from the client. So I think there has to be some
> > > > > > issue in how you write the byte array to the network and what you
> > > > > > are reading out of that i.e. the size of arrays might not be
> > > > > > identical.
> > > > > >
> > > > > > But as Ted mentioned, looking at the code will help troubleshoot it
> > > > > > better.
> > > > > >
> > > > > > On Wed, Feb 20, 2013 at 3:32 PM, Ted Yu <[EMAIL PROTECTED]> wrote:
> > > > > >
> > > > > > > If you show us the code for RowRangeFilter, that would help us
> > > > > > > troubleshoot.
> > > > > > >
> > > > > > > Cheers
> > > > > > >
> > > > > > > On Wed, Feb 20, 2013 at 2:05 PM, Bryan Baugher <[EMAIL PROTECTED]> wrote:
> > > > > > >
> > > > > > > > Hi everyone,
> > > > > > > >
> > > > > > > > I am trying to write my own custom Filter but I have been having
> > > > > > > > issues. When there is only 1 region in my table the scan works as
> > > > > > > > expected but when there is more, it attempts to create a new
> > > > > > > > version of my filter and deserialize the information again but
> > > > > > > > the data seems to be gone. I am running HBase 0.92.1-cdh4.1.1.
> > > > > > > >
> > > > > > > > 2013-02-20 15:39:53,220 DEBUG com.cerner.kepler.filters.RowRangeFilter:
-Bryan
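
For context, client-side usage would look roughly like the sketch below (hypothetical; the table name and row range are made up, and the constructor is the one assumed in the sketch further up). In 0.92 the filter travels with the Scan as a Writable, so every region the scan touches gets a freshly constructed copy populated through readFields():

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.util.Bytes;

public class ScanWithRowRangeFilter {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    HTable table = new HTable(conf, "my_table");  // table name is made up
    try {
      Scan scan = new Scan();
      // The filter is serialized with the Scan and rebuilt on each region.
      scan.setFilter(new RowRangeFilter(Bytes.toBytes("aaa"), Bytes.toBytes("zzz")));
      ResultScanner scanner = table.getScanner(scan);
      try {
        for (Result result : scanner) {
          System.out.println(Bytes.toString(result.getRow()));
        }
      } finally {
        scanner.close();
      }
    } finally {
      table.close();
    }
  }
}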