Re: BigDecimalColumnInterpreter
Thanks for digging, Julian.

Looks like we need to support BigDecimal in HbaseObjectWritable.

Actually once a test is written for BigDecimalColumnInterpreter, it would
become much easier for anyone to debug this issue.
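As a rough starting point, here is a minimal sketch of what such a test might look like. It assumes the BigDecimalColumnInterpreter being discussed is on the classpath (adjust the import to wherever your copy lives) and that your Bytes class has the toBytes(BigDecimal)/toBigDecimal(byte[]) pair; the class and test names are made up for illustration.

    import static org.junit.Assert.assertEquals;

    import java.math.BigDecimal;

    import org.apache.hadoop.hbase.KeyValue;
    // Adjust to wherever your copy of BigDecimalColumnInterpreter lives.
    import org.apache.hadoop.hbase.client.coprocessor.BigDecimalColumnInterpreter;
    import org.apache.hadoop.hbase.util.Bytes;
    import org.junit.Test;

    public class TestBigDecimalColumnInterpreter {

      @Test
      public void getValueRoundTripsABigDecimal() throws Exception {
        BigDecimal expected = new BigDecimal("123.45");
        // Build a cell whose value is the serialized BigDecimal.
        KeyValue kv = new KeyValue(Bytes.toBytes("row1"), Bytes.toBytes("cf"),
            Bytes.toBytes("q"), Bytes.toBytes(expected));
        BigDecimalColumnInterpreter interpreter = new BigDecimalColumnInterpreter();
        // getValue() should hand back exactly the BigDecimal stored in the cell.
        assertEquals(expected,
            interpreter.getValue(Bytes.toBytes("cf"), Bytes.toBytes("q"), kv));
      }
    }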

On Wed, Sep 12, 2012 at 9:27 AM, Julian Wissmann
<[EMAIL PROTECTED]>wrote:

> Hi,
>
> so I'm slowly getting an overview of the code here. I haven't really
> understood the problem yet, though.
>
> DataInput and DataOutput cannot handle BigDecimal, which seems to be
> somewhere close to the root cause of the problem.
> The error is triggered in HBaseServer at line 1642:
>     param.readFields(dis);
> which calls org.apache.hadoop.io.Writable, which defines write() and
> readFields(), and I assume is implemented here by
> HbaseObjectWritable#readFields.
> In HbaseObjectWritable#readObject the DataInput then gets checked for being
> a primitive data type and read accordingly.
>
> Now if I interpret Bytes#valueOf() correctly, it just takes a BigDecimal
> and converts _just_ the value to byte[], not the whole object. So
> what readObject finds here should be interpreted as byte[] and happily
> passed on. The first method that should even care about parsing this into a
> BigDecimal would then be BigDecimalColumnInterpreter#getValue().
>
> To test this, I decided to override write() and readFields(), as I inherit
> them from Exec anyway; however, I have no understanding of how these
> methods work.
> I put in a few printlns to get a feeling for it, but it turns out they are
> never even being called at all.
>
>
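Since DataInput/DataOutput have no built-in BigDecimal support, one common workaround is to write the scale and the unscaled bytes explicitly and rebuild the value on the read side. Below is a minimal illustrative sketch of that idea as a standalone helper; it is not the actual HbaseObjectWritable change being discussed.

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import java.math.BigDecimal;
    import java.math.BigInteger;

    public final class BigDecimalIO {

      // Write the scale, the length of the unscaled bytes, then the bytes.
      public static void write(DataOutput out, BigDecimal value) throws IOException {
        byte[] unscaled = value.unscaledValue().toByteArray();
        out.writeInt(value.scale());
        out.writeInt(unscaled.length);
        out.write(unscaled);
      }

      // Read the fields back in the same order and rebuild the BigDecimal.
      public static BigDecimal read(DataInput in) throws IOException {
        int scale = in.readInt();
        byte[] unscaled = new byte[in.readInt()];
        in.readFully(unscaled);
        return new BigDecimal(new BigInteger(unscaled), scale);
      }
    }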
> 2012/9/10 anil gupta <[EMAIL PROTECTED]>
>
> > Hi Julian,
> >
> > I am using only the cdh4 libraries; I use the jars present under the
> > hadoop and hbase install dirs. In my last email I gave you some more
> > pointers; try to follow them and see what happens.
> > If it still doesn't work for you, then I will try to write a utility
> > to test BigDecimalColumnInterpreter on your setup as well.
> >
> > Thanks,
> > Anil
> >
> > On Mon, Sep 10, 2012 at 9:36 AM, Julian Wissmann
> > <[EMAIL PROTECTED]>wrote:
> >
> > > Hi,
> > >
> > > I haven't really gotten around to working on this since last Wednesday.
> > > I checked readFields() and write() today, but I don't really see why I
> > > would need to reimplement those. Admittedly I'm not that into the whole
> > > HBase codebase yet, so there is a good chance I'm missing something here.
> > >
> > > Also, Anil, which HBase library are you coding this against?
> > > It does seem like madness that, even though we're both using this
> > > identically, it does not work for me.
> > >
> > > Cheers,
> > >
> > > Julian
> > >
> > > 2012/9/6 anil gupta <[EMAIL PROTECTED]>
> > >
> > > > Yes, we do. :)
> > > > Let me know the outcome. If you look at the BD ColumnInterpreter, the
> > > > getValue() method converts the byte array into a BigDecimal, so you
> > > > should not have any problem. The BD ColumnInterpreter is pretty
> > > > similar to LongColumnInterpreter.
> > > >
> > > > Here is the code snippet for the getValue() method, which converts
> > > > byte[] to BigDecimal:
> > > >
> > > >     @Override
> > > >     public BigDecimal getValue(byte[] paramArrayOfByte1,
> > > >         byte[] paramArrayOfByte2, KeyValue kv) throws IOException {
> > > >       if (kv == null || kv.getValue() == null)
> > > >         return null;
> > > >       return Bytes.toBigDecimal(kv.getValue());
> > > >     }
> > > >
> > > > Thanks,
> > > > Anil
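For reference, a caller-side sketch of how such an interpreter is typically handed to AggregationClient for a sum. The table and column names here are invented, the aggregation coprocessor is assumed to be loaded on the table, and the sum() signature shown (byte[] table name) is the 0.92-era one, so check it against your HBase version before copying.

    import java.math.BigDecimal;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.client.coprocessor.AggregationClient;
    // Adjust to wherever your copy of BigDecimalColumnInterpreter lives.
    import org.apache.hadoop.hbase.client.coprocessor.BigDecimalColumnInterpreter;
    import org.apache.hadoop.hbase.util.Bytes;

    public class BigDecimalSumExample {

      public static void main(String[] args) throws Throwable {
        Configuration conf = HBaseConfiguration.create();
        AggregationClient aggregationClient = new AggregationClient(conf);

        // Restrict the scan to the column holding the BigDecimal values.
        Scan scan = new Scan();
        scan.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("amount"));

        // The interpreter tells the aggregation coprocessor how to decode
        // each cell value as a BigDecimal before summing.
        BigDecimal sum = aggregationClient.sum(Bytes.toBytes("my_table"),
            new BigDecimalColumnInterpreter(), scan);
        System.out.println("sum = " + sum);
      }
    }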
> > > >
> > > >
> > > > On Thu, Sep 6, 2012 at 11:43 AM, Julian Wissmann
> > > > <[EMAIL PROTECTED]>wrote:
> > > >
> > > > > 0.92.1 from cdh4. I assume we use the same thing.
> > > > >
> > > > > 2012/9/6 anil gupta <[EMAIL PROTECTED]>
> > > > >
> > > > > > I am using HBase 0.92.1. Which version are you using?
> > > > > >
> > > > > >
> > > > > > On Thu, Sep 6, 2012 at 10:19 AM, anil gupta <
> [EMAIL PROTECTED]