Re: Mapping HBase table in Hive
Ibrahim Yakti 2013-01-13, 11:33
Seems it worked. In the mapping of the column family I used ":key" and
that was it; in addition, as per some articles, there should be no spaces in
the mapping. Below is the CREATE TABLE sample:
CREATE EXTERNAL TABLE hbase_orders(id bigint, value bigint, date_lastchange string, date_inserted string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,date_inserted:bigint,date_lastchange:string,value:string")
TBLPROPERTIES ("hbase.table.name" = "orders");
On Sun, Jan 13, 2013 at 12:23 PM, <[EMAIL PROTECTED]> wrote:
> Hi Ibrahim.
> Sqoop is used to import data from an RDBMS into HBase in your case.
> Please get the schema from HBase for your corresponding table and post it;
> we can point out what your mapping could look like.
> Bejoy KS
> Sent from remote device, Please excuse typos
> *From: * Ibrahim Yakti <[EMAIL PROTECTED]>
> *Date: *Sun, 13 Jan 2013 11:22:51 +0300
> *To: *user<[EMAIL PROTECTED]>
> *ReplyTo: * [EMAIL PROTECTED]
> *Subject: *Re: Mapping HBase table in Hive
> Thanks Bejoy,
> what do you mean by:
>> If you need to map a full CF to a hive column, the data type of the hive
>> column should be a Map.
> suppose I used Sqoop to move data from MySQL to HBase and used id as a
> column family; all the other columns will be qualifiers (QF) then, right?
> The integration documentation is not clear; I think it needs more
> clarification, or maybe I am still missing something.
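
Regarding mapping a full CF to a Hive MAP column, a sketch of what that DDL looks like (table and column-family names here are made up for illustration):

```sql
-- "attrs:" (CF name followed by a bare colon, no qualifier) maps the whole
-- column family; each qualifier becomes a key of the Hive MAP column.
CREATE EXTERNAL TABLE hbase_items(
  id    string,
  attrs map<string,string>
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,attrs:")
TBLPROPERTIES ("hbase.table.name" = "items");
```

This is why the Hive column must be a MAP type when you map an entire CF: the qualifier names are only known at read time.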
> On Tue, Jan 8, 2013 at 9:35 PM, <[EMAIL PROTECTED]> wrote:
>> data type of