HBase, mail # user - BigDecimalColumnInterpreter


Re: BigDecimalColumnInterpreter
Ted Yu 2012-09-05, 19:22
And your HBase version is?

Since you use Double.parseDouble(), it looks like it would be more efficient
to develop a DoubleColumnInterpreter.
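
For reference, a minimal sketch of what such a DoubleColumnInterpreter might
look like against the 0.92/0.94-era ColumnInterpreter interface, modeled on the
shipped LongColumnInterpreter. It assumes the cell values would then be stored
as raw 8-byte doubles written with Bytes.toBytes(double) rather than as
serialized BigDecimals, and the exact method set may differ in other HBase
versions:

------
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;

import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.coprocessor.ColumnInterpreter;
import org.apache.hadoop.hbase.util.Bytes;

// Sketch of a double-based interpreter, modeled on the 0.92/0.94
// LongColumnInterpreter. Assumes cell values are stored as raw 8-byte
// doubles written with Bytes.toBytes(double).
public class DoubleColumnInterpreter implements ColumnInterpreter<Double, Double> {

  @Override
  public Double getValue(byte[] colFamily, byte[] colQualifier, KeyValue kv)
      throws IOException {
    if (kv == null || kv.getValueLength() != Bytes.SIZEOF_DOUBLE)
      return null;
    return Bytes.toDouble(kv.getBuffer(), kv.getValueOffset());
  }

  @Override
  public Double add(Double d1, Double d2) {
    if (d1 == null || d2 == null)
      return (d1 == null) ? d2 : d1;  // null is treated as "no value"
    return d1 + d2;
  }

  @Override
  public int compare(Double d1, Double d2) {
    if (d1 == null || d2 == null)
      return (d1 == null) ? ((d2 == null) ? 0 : -1) : 1;
    return d1.compareTo(d2);
  }

  @Override
  public Double getMaxValue() {
    return Double.MAX_VALUE;
  }

  @Override
  public Double getMinValue() {
    return -Double.MAX_VALUE;
  }

  @Override
  public Double increment(Double o) {
    return o == null ? null : o + 1.0;
  }

  @Override
  public Double multiply(Double d1, Double d2) {
    return (d1 == null || d2 == null) ? null : d1 * d2;
  }

  @Override
  public Double castToReturnType(Double o) {
    return o;
  }

  @Override
  public double divideForAvg(Double sum, Long count) {
    return (sum == null || count == null) ? Double.NaN : sum / count;
  }

  // The interpreter is shipped to the region servers over the coprocessor
  // RPC as a Writable; it carries no state, so there is nothing to serialize.
  @Override
  public void write(DataOutput out) throws IOException {
  }

  @Override
  public void readFields(DataInput in) throws IOException {
  }
}
------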

On Wed, Sep 5, 2012 at 12:07 PM, Julian Wissmann
<[EMAIL PROTECTED]>wrote:

> Hi,
> the schema looks like this:
> RowKey: id,timerange_timestamp,offset (String)
> Qualifier: Offset (long)
> Timestamp: timestamp (long)
> Value: number (BigDecimal)
>
> Or, as code, when I read the data from CSV:
> byte[] value = Bytes.toBytes(BigDecimal.valueOf(Double.parseDouble(cData[2])));
>
> Cheers,
>
> Julian
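
For context, the write path described above, and the decode step a
BigDecimal-based interpreter has to reverse, could be round-tripped as in the
sketch below (hypothetical row key, family, and qualifier names, assuming the
Bytes.toBytes(BigDecimal) / Bytes.toBigDecimal pair from the 0.94-era Bytes
class):

------
import java.math.BigDecimal;

import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

// Round trip of the value encoding described above: whatever is written
// into the Put must decode with the same Bytes helper on the server side.
public class ValueEncodingExample {
  public static void main(String[] args) {
    String csvField = "42.5";  // stand-in for cData[2]
    BigDecimal number = BigDecimal.valueOf(Double.parseDouble(csvField));
    byte[] value = Bytes.toBytes(number);

    // Hypothetical row key / family / qualifier, following the schema above.
    Put put = new Put(Bytes.toBytes("someId,1346860800,0"));
    put.add(Bytes.toBytes("family"), Bytes.toBytes("0"), value);

    // The interpreter has to reverse exactly this encoding on the server.
    BigDecimal decoded = Bytes.toBigDecimal(value);
    System.out.println(decoded);  // prints the decoded value
  }
}
------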
>
> 2012/9/5 Ted Yu <[EMAIL PROTECTED]>
>
> > You haven't told us the schema of your table yet.
> > Your table should have column whose value can be interpreted by
> > BigDecimalColumnInterpreter.
> >
> > Cheers
> >
> > On Wed, Sep 5, 2012 at 9:17 AM, Julian Wissmann <
> [EMAIL PROTECTED]
> > >wrote:
> >
> > > Hi,
> > >
> > > I am currently experimenting with the BigDecimalColumnInterpreter from
> > > https://issues.apache.org/jira/browse/HBASE-6669.
> > >
> > > I was thinking the best way for me to work with it would be to use the
> > Java
> > > class and just use that as is.
> > >
> > > I imported it into my project and tried to work with it as is, by just
> > > instantiating the ColumnInterpreter as a BigDecimalColumnInterpreter.
> > > That threw errors and also complained about not being able to find such
> > > a class.
> > >
> > > So I did some reading and found out that I'd need to have an Endpoint
> > > for it. So I imported AggregateImplementation and AggregateProtocol into
> > > my workspace, renamed them, and refactored them where necessary to take
> > > BigDecimal. I then re-exported the jar and had another try.
> > >
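
For orientation, the renamed protocol interface mentioned above would
presumably keep the shape of the 0.92/0.94 AggregateProtocol it was copied
from; a minimal sketch, showing only the getMax() call used below (name and
package are placeholders):

------
import java.io.IOException;
import java.math.BigDecimal;

import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.coprocessor.ColumnInterpreter;
import org.apache.hadoop.hbase.ipc.CoprocessorProtocol;

// Hypothetical shape of the renamed endpoint protocol, modeled on
// AggregateProtocol; the matching implementation (renamed from
// AggregateImplementation) would run on the region servers.
public interface BigDecimalProtocol extends CoprocessorProtocol {

  // Returns the maximum value for the given column, as interpreted by ci,
  // over the rows selected by the scan.
  BigDecimal getMax(ColumnInterpreter<BigDecimal, BigDecimal> ci, Scan scan)
      throws IOException;
}
------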
> > > So when I call:
> > > ------
> > > final Scan scan = new Scan((metricID + "," + basetime_begin).getBytes(),
> > >     (metricID + "," + basetime_end).getBytes());
> > > scan.addFamily(family.getBytes());
> > > final ColumnInterpreter<BigDecimal, BigDecimal> ci =
> > >     new BigDecimalColumnInterpreter();
> > > Map<byte[], BigDecimal> results = table.coprocessorExec(
> > >     BigDecimalProtocol.class, null, null,
> > >     new Batch.Call<BigDecimalProtocol, BigDecimal>() {
> > >       public BigDecimal call(BigDecimalProtocol instance) throws IOException {
> > >         return instance.getMax(ci, scan);
> > >       }
> > >     });
> > > ------
> > > I get errors in the log again that it can't find
> > > BigDecimalColumnInterpreter... okay, so I tried:
> > > ------
> > > Scan scan = new Scan((metricID + "," + basetime_begin).getBytes(),
> > >     (metricID + "," + basetime_end).getBytes());
> > > scan.addFamily(family.getBytes());
> > > final ColumnInterpreter<BigDecimal, BigDecimal> ci =
> > >     new BigDecimalColumnInterpreter();
> > > AggregationClient ag = new AggregationClient(config);
> > > BigDecimal max = ag.max(Bytes.toBytes(tableName), ci, scan);
> > > ------
> > > I don't get errors recorded in the log anymore, but a load of Java error
> > > output:
> > > ------
> > >
> > > java.util.concurrent.ExecutionException:
> > > org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after
> > > attempts=10, exceptions:
> > > Wed Sep 05 18:13:43 CEST 2012,
> > > org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@50502819,
> > > java.io.IOException:
> > > IPC server unable to read call parameters: Error in readFields
> > > Wed Sep 05 18:13:44 CEST 2012,
> > > org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@50502819,
> > > java.io.IOException:
> > > IPC server unable to read call parameters: Error in readFields
> > > Wed Sep 05 18:13:45 CEST 2012,
> > > org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@50502819,
> > > java.io.IOException:
> > > IPC server unable to read call parameters: Error in readFields
> > > Wed Sep 05 18:13:46 CEST 2012,
> > > org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@50502819,
> > > java.io.IOException:
> > > IPC server unable to read call parameters: Error in readFields
> > > Wed Sep 05 18:13:49 CEST 2012,
> > > org.apache.hadoop.hbase.ipc.ExecRPCInvoker$1@50502819,
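
The repeated "IPC server unable to read call parameters: Error in readFields"
retries are the kind of failure commonly seen when the region servers cannot
deserialize the custom classes sent over the Exec RPC, for example because the
jar containing the renamed endpoint and the BigDecimalColumnInterpreter is not
on their classpath, or because the endpoint was never loaded. Purely as an
illustration (the table name and class name are placeholders, and it assumes
the jar is already deployed on every region server), a per-table registration
on 0.92/0.94 could look like the sketch below; alternatively, the endpoint
class can be listed under hbase.coprocessor.region.classes in hbase-site.xml:

------
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.util.Bytes;

// Attach a custom aggregation endpoint to an existing table. Assumes the
// endpoint class is already on every region server's classpath.
public class RegisterEndpoint {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    HBaseAdmin admin = new HBaseAdmin(conf);
    byte[] tableName = Bytes.toBytes("metrics");  // hypothetical table name

    HTableDescriptor desc = admin.getTableDescriptor(tableName);
    // Placeholder package; the class is the renamed AggregateImplementation.
    desc.addCoprocessor("my.package.BigDecimalProtocolImpl");

    admin.disableTable(tableName);
    admin.modifyTable(tableName, desc);
    admin.enableTable(tableName);
    admin.close();
  }
}
------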