HBase user mailing list >> BigDecimalColumnInterpreter


Julian Wissmann 2012-09-05, 16:17
Ted Yu 2012-09-05, 16:22
Julian Wissmann 2012-09-05, 19:07
Ted Yu 2012-09-05, 19:22
Julian Wissmann 2012-09-05, 19:49
Ted Yu 2012-09-05, 20:04
Julian Wissmann 2012-09-05, 20:30
anil gupta 2012-09-05, 21:04
anil gupta 2012-09-05, 21:27
Julian Wissmann 2012-09-06, 09:28
anil gupta 2012-09-06, 17:19
anil gupta 2012-09-06, 17:24
Julian Wissmann 2012-09-06, 18:43
anil gupta 2012-09-06, 19:22
Julian Wissmann 2012-09-10, 16:36
Re: BigDecimalColumnInterpreter
Hi Julian,

I am using only the CDH4 libraries; the jars I use are the ones under the
Hadoop and HBase install directories. In my last email I gave you some more
pointers. Try to follow them and see what happens.
If it still doesn't work for you, I will try to write a utility to test the
BigDecimalColumnInterpreter on your setup as well.

Thanks,
Anil
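
A minimal sketch of the kind of standalone test utility Anil mentions, assuming the aggregation endpoint is already loaded on the cluster; the table, family and qualifier names below are placeholders, and mypackage.BigDecimalColumnInterpreter stands for Julian's own class:

import java.math.BigDecimal;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.client.coprocessor.AggregationClient;
import org.apache.hadoop.hbase.util.Bytes;

public class BigDecimalSumTest {
  public static void main(String[] args) throws Throwable {
    Configuration conf = HBaseConfiguration.create();
    Scan scan = new Scan();
    // Restrict the scan to one column: family AND qualifier, as suggested
    // earlier in the thread ("mytable", "cf" and "value" are placeholders).
    scan.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("value"));

    AggregationClient ag = new AggregationClient(conf);
    BigDecimal sum = ag.sum(Bytes.toBytes("mytable"),
        new mypackage.BigDecimalColumnInterpreter(), scan);
    System.out.println("sum = " + sum);
  }
}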

On Mon, Sep 10, 2012 at 9:36 AM, Julian Wissmann <[EMAIL PROTECTED]> wrote:

> Hi,
>
> I haven't really gotten around to working on this since last Wednesday.
> I checked readFields() and write() today, but I don't really see why I
> would need to reimplement them. Admittedly I'm not that familiar with
> the whole HBase codebase yet, so there is a good chance I'm missing
> something here.
>
> Also, Anil, which HBase library are you coding this against?
> It seems like madness that, even though we're both using this identical
> code, it does not work for me.
>
> Cheers,
>
> Julian
>
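
On the readFields()/write() question above: in 0.92 the ColumnInterpreter contract goes through Writable, and for an interpreter that carries no state of its own those two methods can stay empty, just as LongColumnInterpreter leaves them. A minimal sketch of how they might look inside the BigDecimalColumnInterpreter (an assumption, not the verified class):

// Inside BigDecimalColumnInterpreter; DataInput/DataOutput are java.io types.
@Override
public void write(DataOutput out) throws IOException {
  // stateless interpreter: nothing to serialize
}

@Override
public void readFields(DataInput in) throws IOException {
  // stateless interpreter: nothing to deserialize
}

If these are empty, an "Error in readFields" on the server side can point elsewhere, for instance at the interpreter class not being visible on the region servers' classpath.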
> 2012/9/6 anil gupta <[EMAIL PROTECTED]>
>
> > Yes, we do. :)
> > Let me know the outcome. If you look at the BD ColumnInterpreter, the
> > getValue method converts the byte array into a BigDecimal, so you
> > should not have any problem. The BD ColumnInterpreter is pretty
> > similar to LongColumnInterpreter.
> >
> > Here is the code snippet for the getValue() method, which converts a
> > byte[] to a BigDecimal:
> >
> > @Override
> > public BigDecimal getValue(byte[] paramArrayOfByte1, byte[] paramArrayOfByte2,
> >     KeyValue kv) throws IOException {
> >   if (kv == null || kv.getValue() == null)
> >     return null;
> >   return Bytes.toBigDecimal(kv.getValue());
> > }
> >
> > Thanks,
> > Anil
> >
> >
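
Since the interpreter is described as being very close to LongColumnInterpreter, its other callbacks presumably follow the same pattern. A sketch, assuming (not verified against the actual patch) that add(), which sum() relies on, mirrors LongColumnInterpreter's null handling:

// Sketch of add() inside BigDecimalColumnInterpreter (assumed, not the author's code).
@Override
public BigDecimal add(BigDecimal bd1, BigDecimal bd2) {
  if (bd1 == null) {
    return bd2;          // still null if both inputs are null
  }
  if (bd2 == null) {
    return bd1;
  }
  return bd1.add(bd2);   // BigDecimal addition is exact, no overflow to worry about
}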
> > On Thu, Sep 6, 2012 at 11:43 AM, Julian Wissmann <[EMAIL PROTECTED]> wrote:
> >
> > > 0.92.1 from cdh4. I assume we use the same thing.
> > >
> > > 2012/9/6 anil gupta <[EMAIL PROTECTED]>
> > >
> > > > I am using HBase 0.92.1. Which version are you using?
> > > >
> > > >
> > > > On Thu, Sep 6, 2012 at 10:19 AM, anil gupta <[EMAIL PROTECTED]> wrote:
> > > >
> > > > > Hi Julian,
> > > > >
> > > > > You need to add the column qualifier explicitly to the scanner;
> > > > > you have only added the column family. I am also assuming that
> > > > > you are writing the byte array of a BigDecimal object as the
> > > > > value of these cells in HBase. Is that right?
> > > > >
> > > > > Thanks,
> > > > > Anil
> > > > >
> > > > >
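
On the writing side Anil asks about, a cell written like the sketch below is what Bytes.toBigDecimal() in getValue() can read back. Row, table and column names are placeholders, and it assumes Bytes.toBytes(BigDecimal) is available in this HBase version, as its counterpart Bytes.toBigDecimal() evidently is:

import java.math.BigDecimal;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class WriteBigDecimalCell {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    HTable table = new HTable(conf, "mytable");      // placeholder table name
    try {
      Put put = new Put(Bytes.toBytes("row-1"));     // placeholder row key
      // Store the value in the same encoding that Bytes.toBigDecimal()
      // expects when the aggregation endpoint reads the cell back.
      put.add(Bytes.toBytes("cf"), Bytes.toBytes("value"),
          Bytes.toBytes(new BigDecimal("42.5")));
      table.put(put);
    } finally {
      table.close();
    }
  }
}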
> > > > > On Thu, Sep 6, 2012 at 2:28 AM, Julian Wissmann <[EMAIL PROTECTED]> wrote:
> > > > >
> > > > >> Hi, Anil,
> > > > >>
> > > > >> I presume you mean something like this:
> > > > >>
> > > > >> Scan scan = new Scan(_start, _end);
> > > > >> scan.addFamily(family.getBytes());
> > > > >> final ColumnInterpreter<BigDecimal, BigDecimal> ci =
> > > > >>     new mypackage.BigDecimalColumnInterpreter();
> > > > >> AggregationClient ag =
> > > > >>     new org.apache.hadoop.hbase.client.coprocessor.AggregationClient(config);
> > > > >> BigDecimal sum = ag.sum(Bytes.toBytes(tableName),
> > > > >>     new BigDecimalColumnInterpreter(), scan);
> > > > >>
> > > > >>
> > > > >> When I call this, with the Endpoint in place and loaded as a
> > > > >> jar, I get the above error.
> > > > >> When I call it without the endpoint loaded as a coprocessor,
> > > > >> though, I get this:
> > > > >>
> > > > >> java.util.concurrent.ExecutionException:
> > > > >> org.apache.hadoop.hbase.client.RetriesExhaustedException:
> > > > >> Failed after attempts=10, exceptions:
> > > > >> Thu Sep 06 11:07:39 CEST 2012,
> > > > >> org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@7bd6747b,
> > > > >> java.io.IOException: IPC server unable to read call parameters:
> > > > >> Error in readFields
> > > > >> Thu Sep 06 11:07:40 CEST 2012,
> > > > >> org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@7bd6747b,
> > > > >> java.io.IOException:

Thanks & Regards,
Anil Gupta
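
For anyone hitting the same exceptions: in 0.92 the AggregationClient only works if the AggregateImplementation endpoint is loaded on the region servers, and a custom interpreter additionally has to be on their classpath. A sketch of registering the endpoint per table from client code, assuming HTableDescriptor.addCoprocessor() is available in this client version; the table name is a placeholder, and setting hbase.coprocessor.region.classes in hbase-site.xml is the cluster-wide alternative:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.util.Bytes;

public class EnableAggregateEndpoint {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    HBaseAdmin admin = new HBaseAdmin(conf);
    byte[] table = Bytes.toBytes("mytable");         // placeholder table name

    // Attach the built-in aggregation endpoint so AggregationClient calls
    // (sum, min, max, ...) can reach it; the custom interpreter jar still
    // has to be on the region servers' classpath.
    HTableDescriptor desc = admin.getTableDescriptor(table);
    desc.addCoprocessor("org.apache.hadoop.hbase.coprocessor.AggregateImplementation");

    admin.disableTable(table);
    admin.modifyTable(table, desc);
    admin.enableTable(table);
  }
}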
Julian Wissmann 2012-09-12, 16:27
Ted Yu 2012-09-12, 16:45
Julian Wissmann 2012-09-12, 19:56
anil gupta 2012-09-14, 07:56
anil gupta 2012-09-14, 19:27
anil gupta 2012-09-14, 22:17
Ted Yu 2012-09-15, 14:56
anil gupta 2012-09-15, 15:26
Julian Wissmann 2012-10-03, 16:26