RE: Hive upload
yogesh dhari 2012-07-04, 08:10
Thank you very much for your response,
A) When I run the command 'show tables' it doesn't show the newhive table.
B) Yes, the newhive directory is present in /user/hive/warehouse and it contains the values imported from the RDBMS.
Please suggest, and give me an example of the sqoop import command you would recommend for this case.
A) Here is the command
describe formatted letstry;
# col_name data_type comment
rollno int None
name string None
numbr int None
sno int None
# Detailed Table Information
CreateTime: Tue Jul 03 17:06:27 GMT+05:30 2012
Protect Mode: None
Table Type: MANAGED_TABLE
# Storage Information
SerDe Library: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
Num Buckets: -1
Bucket Columns: 
Sort Columns: 
Storage Desc Params:
Time taken: 0.101 seconds
B) hadoop dfs -ls /user/hive/warehouse/letstry/
Found 1 items
-rw-r--r-- 1 mediaadmin supergroup 17 2012-07-02 12:05 /user/hive/warehouse/letstry/part-m-00000
hadoop dfs -cat /user/hive/warehouse/letstry/part-m-00000
The data is present here, but when I load it into Hive it gets deleted from HDFS, and in Hive the values appear as NULL instead of (1,John,123,abc,2). Also, I didn't understand your point regarding the correct data format (this data was imported from a MySQL table). And what kind of configuration is needed in Sqoop?
Please suggest and help
Subject: Re: Hive upload
To: [EMAIL PROTECTED]
From: [EMAIL PROTECTED]
Date: Wed, 4 Jul 2012 05:58:41 +0000
Regarding the first issue (the Sqoop one):
1) Does the table newhive appear when you list tables using 'show tables'?
2) Do you see a directory 'newhive' in your Hive warehouse dir (usually /user/hive/warehouse)?
If not, Sqoop is failing to create the Hive table / load data into it: only the Sqoop import to HDFS is succeeding, and the Hive part is failing.
If Hive in standalone mode works as desired, you need to check the Sqoop configuration.
Regarding the second issue: can you check the storage location of NewTable and see whether there are files within? If so, do a 'cat' of those files and check whether they have the correct data format.
You can get the location of your table from the following command:
describe formatted NewTable;
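If it helps, the warehouse path can be pulled out of that output directly; a minimal sketch (the sample Location line and hdfs:// path below are hypothetical stand-ins for the real 'describe formatted' output):

```shell
# Extract the HDFS path from the tab-separated "Location:" line that
# 'describe formatted' prints. In practice you would pipe the real
# output, e.g.:  hive -e "describe formatted NewTable;" | awk ...
printf 'Location:\thdfs://localhost:9000/user/hive/warehouse/newtable\n' |
  awk -F'\t' '/^Location:/ { print $2 }'
```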
Sent from handheld, please excuse typos.
From: yogesh dhari <[EMAIL PROTECTED]>
Date: Wed, 4 Jul 2012 11:09:02 +0530
To: hive request <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Hive upload
I am trying to upload tables from an RDBMS to Hive through Sqoop. The import reports success, but I don't find any table in Hive; the imported table only gets uploaded into the HDFS dir /user/hive/warehouse. I want it to be present in Hive. I used this command:
sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table newone --hive-table newhive --create-hive-table --hive-import --target-dir /user/hive/warehouse/new
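One adjustment sometimes suggested for this symptom (a sketch only, not verified against this setup) is to keep --target-dir out of the Hive warehouse: with --hive-import, Sqoop stages the data in the target dir and then loads it into the warehouse itself, so pointing --target-dir inside /user/hive/warehouse can collide with the directory the Hive load step wants to create. The staging path below is hypothetical:

```shell
# Sketch: same import, but staged outside /user/hive/warehouse so the
# --hive-import step can move the data into the warehouse itself.
sqoop import \
  --connect jdbc:mysql://localhost:3306/Demo \
  --username sqoop1 --password SQOOP1 \
  --table newone \
  --hive-import --create-hive-table \
  --hive-table newhive \
  --target-dir /tmp/sqoop_staging/newone   # hypothetical staging path
```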
Another thing: if I load any file or table from HDFS or from local, the load succeeds but the data doesn't show in the Hive table.
If I run the command 'select * from NewTable;' it shows
Null Null Null Null
although the real data is
Yogesh 4 Bangalore 1234
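A common cause of all-NULL rows like this (a guess from the symptoms, not confirmed here) is a field-delimiter mismatch: Sqoop writes comma-separated files by default, while a Hive table created without ROW FORMAT DELIMITED expects Ctrl-A (\001) separators, so the whole comma-separated line is read as one field and every typed column comes out NULL. A quick local sketch of the mismatch:

```shell
# A comma-separated record, as Sqoop writes it by default:
row='1,John,123,abc,2'

# Split on Hive's default field delimiter, Ctrl-A (\001):
# the line never splits, so awk sees a single field.
printf '%s' "$row" | awk -F'\001' '{print NF}'   # prints 1

# Split on the comma the data actually uses:
printf '%s' "$row" | awk -F',' '{print NF}'      # prints 5
```

If this is the cause, creating the Hive table with ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' before loading the file is the usual remedy.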
Please suggest and help.