Re: Mapping existing HBase table with many columns to Hive
David Koch 2012-12-10, 00:03
I tried the shell command which Swarnim kindly provided, and it allows me to
map an existing HBase table into Hive. However, since my qualifiers are
longs but Hive's map type only accepts string keys, the result is garbled.
Even with the suggested patch that allows binary map keys, the resulting
datatype in Hive would be binary rather than long, making it hard to query
from the shell. It seems there is no API for this at the moment, right?
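To illustrate the garbling: HBase's Bytes.toBytes(long) serializes a long as 8 big-endian bytes, which are mostly non-printable when Hive reinterprets them as a string key. A minimal Python sketch of that effect (the sample qualifier value is hypothetical):

```python
import struct

# HBase's Bytes.toBytes(long) writes the value as 8 big-endian bytes.
# 1354834980000 is an illustrative epoch-millis qualifier, not from the thread.
qualifier = struct.pack(">q", 1354834980000)

# Read back as text, the raw bytes are control characters and junk --
# this is what a string-keyed Hive map shows for a long qualifier.
print(repr(qualifier.decode("latin-1")))

# Decoded as the long it actually is, the value survives intact.
print(struct.unpack(">q", qualifier)[0])
```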
Currently, is there any way to map HBase bytes to Hive datatypes?
The assumption is that all bytes were generated using HBase's
Bytes.toBytes(<type>) method, and that all row keys, all qualifiers, and
all values share one data type respectively (for example: row keys are
ints, qualifiers are longs, and values are strings).
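Under that assumption, the per-type decoding could be done outside Hive, e.g. in a client script or a custom SerDe. A sketch of the layout described above, mirroring Bytes.toBytes (ints are 4 big-endian bytes, longs 8, strings UTF-8); the helper names and sample values are illustrative, not from the thread:

```python
import struct

# Decoders matching HBase's Bytes.toBytes encodings:
# int -> 4 big-endian bytes, long -> 8 big-endian bytes, string -> UTF-8.
def decode_int(b: bytes) -> int:
    return struct.unpack(">i", b)[0]

def decode_long(b: bytes) -> int:
    return struct.unpack(">q", b)[0]

def decode_string(b: bytes) -> str:
    return b.decode("utf-8")

# Hypothetical cell, laid out as the poster assumes:
row_key = struct.pack(">i", 42)               # int row key
qualifier = struct.pack(">q", 1354752000000)  # long qualifier
value = "hello".encode("utf-8")               # string value

print(decode_int(row_key), decode_long(qualifier), decode_string(value))
```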
On Thu, Dec 6, 2012 at 9:23 PM, David Koch <[EMAIL PROTECTED]> wrote:
> Hello Swarnim,
> Thank you for your answer. I will try the options you pointed out.
> On Thu, Dec 6, 2012 at 9:10 PM, [EMAIL PROTECTED] <
> [EMAIL PROTECTED]> wrote: