Hive >> mail # user >> Hive upload


RE: Hive upload

Hi Bejoy,

Thank you very much for your response.

1)
A) When I run the command 'show tables;' it doesn't show the newhive table.
B) Yes, the newhive directory is present in /user/hive/warehouse and contains the values imported from the RDBMS.

Please suggest, and give me an example of the sqoop import command you would use for this case.

2)
A) Here is the command  

describe formatted letstry;
OK
# col_name                data_type               comment            
          
rollno                  int                     None                
name                    string                  None                
numbr                   int                     None                
sno                     int                     None                
          
# Detailed Table Information          
Database:               default                  
Owner:                  mediaadmin              
CreateTime:             Tue Jul 03 17:06:27 GMT+05:30 2012    
LastAccessTime:         UNKNOWN                  
Protect Mode:           None                    
Retention:              0                        
Location:               hdfs://localhost:9000/user/hive/warehouse/letstry    
Table Type:             MANAGED_TABLE            
Table Parameters:          
    transient_lastDdlTime    1341315550          
          
# Storage Information          
SerDe Library:          org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe    
InputFormat:            org.apache.hadoop.mapred.TextInputFormat    
OutputFormat:           org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat    
Compressed:             No                      
Num Buckets:            -1                      
Bucket Columns:         []                      
Sort Columns:           []                      
Storage Desc Params:          
    serialization.format    1                  
Time taken: 0.101 seconds
B) hadoop dfs -ls /user/hive/warehouse/letstry/
Found 1 items
-rw-r--r--   1 mediaadmin supergroup         17 2012-07-02 12:05 /user/hive/warehouse/letstry/part-m-00000

hadoop dfs -cat /user/hive/warehouse/letstry/part-m-00000
1,John,123,abc,2

Here the data is present, but when I load it into Hive it gets deleted from HDFS, and in Hive the values appear as NULL instead of (1,John,123,abc,2). Also, I didn't understand your point regarding the correct data format (this data was imported from a MySQL table). And what kind of configuration is needed in sqoop?
Please suggest and help.
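One common cause of this symptom is a field-delimiter mismatch: a plain Sqoop import to HDFS writes comma-separated text, while a Hive table created without a ROW FORMAT DELIMITED clause expects Ctrl-A (\001) as the field separator. LOAD DATA then moves the file into the warehouse (which is why it disappears from its old HDFS path), but every row parses as a single field, and casting that to the table's typed columns yields NULL. A quick sketch of the mismatch:

```shell
# Sketch: split the imported line on comma vs. on Hive's default \001 delimiter.
line='1,John,123,abc,2'

# Splitting on comma gives the five expected fields.
on_comma=$(printf '%s\n' "$line" | awk -F',' '{print NF}')

# Splitting on Ctrl-A (\001), Hive's default, gives a single field;
# casting that one field to the table's int columns yields NULL.
on_ctrl_a=$(printf '%s\n' "$line" | awk 'BEGIN{FS="\001"}{print NF}')

echo "comma: $on_comma fields, ctrl-a: $on_ctrl_a field"
# prints: comma: 5 fields, ctrl-a: 1 field
```

Declaring the table with ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' (or importing with delimiters that match the table definition) makes the two sides agree.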

Greetings,
Yogesh Kumar

Subject: Re: Hive upload
To: [EMAIL PROTECTED]
From: [EMAIL PROTECTED]
Date: Wed, 4 Jul 2012 05:58:41 +0000
Hi Yogesh

The first issue (sqoop one).
1) Is the table newhive listed when you run 'show tables'?
2) Are you seeing a directory 'newhive' in your Hive warehouse dir (usually /user/hive/warehouse)?

If not, sqoop is failing to create the hive tables / load data into them. Only the sqoop import to HDFS is succeeding; the hive part is failing.

If hive in stand-alone mode works as desired, you need to check the sqoop configuration.

Regarding the second issue, can you check the storage location of NewTable and see whether there are files within? If so, do a 'cat' of those files and check whether they have the correct data format.

You can get the location of your table from the following command
describe formatted NewTable;
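The path to check is on the "Location:" line of that output. A small sketch of pulling it out with awk (the sample line below is copied from the letstry output pasted earlier in this thread, not from NewTable):

```shell
# Extract the HDFS path from a pasted 'describe formatted' block.
# awk splits on whitespace by default, so the URL is the second field
# of the "Location:" line.
location=$(awk '/^Location:/{print $2}' <<'EOF'
Location:               hdfs://localhost:9000/user/hive/warehouse/letstry
EOF
)
echo "$location"
# prints: hdfs://localhost:9000/user/hive/warehouse/letstry
```

Running 'hadoop dfs -ls' on that path then shows the files to 'cat'.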
Regards
Bejoy KS

Sent from handheld, please excuse typos.

From: yogesh dhari <[EMAIL PROTECTED]>
Date: Wed, 4 Jul 2012 11:09:02 +0530
To: hive request <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Hive upload

Hi all,
I am trying to upload tables from an RDBMS to Hive through sqoop. The sqoop import completes successfully, but I don't find any table in Hive; the imported table gets uploaded into the HDFS dir /user/hive/warehouse. I want it to be present in Hive. I used this command:
sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table newone --hive-table newhive --create-hive-table --hive-import --target-dir /user/hive/warehouse/new
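For comparison, a hedged variant of that command (a sketch, not tested against this setup): with --hive-import, Sqoop creates and loads the Hive table itself, so pointing --target-dir inside /user/hive/warehouse can collide with the directory the Hive load wants to manage, and --fields-terminated-by makes the file delimiter explicit so the table definition and the data agree. The connection details and the table names newone/newhive are taken from the command above.

```shell
# Sketch of the same import without the conflicting --target-dir.
# -P prompts for the password instead of exposing it on the command line.
sqoop import \
  --connect jdbc:mysql://localhost:3306/Demo \
  --username sqoop1 -P \
  --table newone \
  --hive-import --create-hive-table \
  --hive-table newhive \
  --fields-terminated-by ','
```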

And another thing: if I upload any file or table from HDFS or from the local filesystem, the upload succeeds but the data doesn't show in the Hive table.
If I run the command
Select * from NewTable;
it reflects
Null     Null     Null    Null

although the real data is
Yogesh    4    Bangalore   1234

Please suggest and help.
Regards
Yogesh Kumar