RE: Hive uploading
yogesh.kumar13@... 2012-07-05, 12:37
Hi Bejoy,

I have created a new table called troy, and for Hive it is troyhive, as the previous run was failing with "Output directory already exists" (see the note after the log below):
sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table troy --hive-table troyhive  --create-hive-table  --hive-import --hive-home HADOOP/hive --verbose
12/07/05 17:57:16 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
12/07/05 17:57:16 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
12/07/05 17:57:16 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
12/07/05 17:57:16 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
12/07/05 17:57:16 INFO tool.CodeGenTool: Beginning code generation
12/07/05 17:57:17 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `troy` AS t LIMIT 1
12/07/05 17:57:17 INFO orm.CompilationManager: HADOOP_HOME is /HADOOP/hadoop-0.20.2/bin/..
12/07/05 17:57:17 INFO orm.CompilationManager: Found hadoop core jar at: /HADOOP/hadoop-0.20.2/bin/../hadoop-0.20.2-core.jar
Note: /tmp/sqoop-mediaadmin/compile/26f5861253b910681eade0bd0e84efb5/troy.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/07/05 17:57:17 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-mediaadmin/compile/26f5861253b910681eade0bd0e84efb5/troy.jar
12/07/05 17:57:17 WARN manager.MySQLManager: It looks like you are importing from mysql.
12/07/05 17:57:17 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
12/07/05 17:57:17 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
12/07/05 17:57:17 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
12/07/05 17:57:17 INFO mapreduce.ImportJobBase: Beginning import of troy
12/07/05 17:57:18 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`num`), MAX(`num`) FROM `troy`
12/07/05 17:57:18 INFO mapred.JobClient: Running job: job_201207051104_0005
12/07/05 17:57:19 INFO mapred.JobClient:  map 0% reduce 0%
12/07/05 17:57:30 INFO mapred.JobClient:  map 100% reduce 0%
12/07/05 17:57:32 INFO mapred.JobClient: Job complete: job_201207051104_0005
12/07/05 17:57:32 INFO mapred.JobClient: Counters: 5
12/07/05 17:57:32 INFO mapred.JobClient:   Job Counters
12/07/05 17:57:32 INFO mapred.JobClient:     Launched map tasks=1
12/07/05 17:57:32 INFO mapred.JobClient:   FileSystemCounters
12/07/05 17:57:32 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=8
12/07/05 17:57:32 INFO mapred.JobClient:   Map-Reduce Framework
12/07/05 17:57:32 INFO mapred.JobClient:     Map input records=1
12/07/05 17:57:32 INFO mapred.JobClient:     Spilled Records=0
12/07/05 17:57:32 INFO mapred.JobClient:     Map output records=1
12/07/05 17:57:32 INFO mapreduce.ImportJobBase: Transferred 8 bytes in 14.6895 seconds (0.5446 bytes/sec)
12/07/05 17:57:32 INFO mapreduce.ImportJobBase: Retrieved 1 records.
12/07/05 17:57:32 INFO hive.HiveImport: Removing temporary files from import process: troy/_logs
12/07/05 17:57:32 INFO hive.HiveImport: Loading uploaded data into Hive
12/07/05 17:57:32 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `troy` AS t LIMIT 1
12/07/05 17:57:34 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/HADOOP/hive/lib/hive-common-0.8.1.jar!/hive-log4j.properties
12/07/05 17:57:34 INFO hive.HiveImport: Hive history file=/tmp/mediaadmin/hive_job_log_mediaadmin_201207051757_1184599996.txt
12/07/05 17:57:39 INFO hive.HiveImport: OK
12/07/05 17:57:39 INFO hive.HiveImport: Time taken: 4.249 seconds
12/07/05 17:57:39 INFO hive.HiveImport: Loading data to table default.troyhive
12/07/05 17:57:39 INFO hive.HiveImport: OK
12/07/05 17:57:39 INFO hive.HiveImport: Time taken: 0.257 seconds
12/07/05 17:57:39 INFO hive.HiveImport: Hive import complete.
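
A note on the "Output directory already exists" error that prompted the table rename: Sqoop stages imported data in an HDFS target directory (by default, the table name under the user's HDFS home), and a leftover directory from an earlier or failed run triggers this error. Rather than renaming the table, the stale directory can be removed; a minimal sketch, assuming the default target path for the previously imported table (Dummy) and this log's user (mediaadmin):

    # Hypothetical default target dir left over from the earlier Dummy import
    hadoop fs -rmr /user/mediaadmin/Dummy    # Hadoop 0.20 syntax; later releases use: hadoop fs -rm -r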

Regards
Yogesh Kumar

________________________________
From: Bejoy Ks [[EMAIL PROTECTED]]
Sent: Thursday, July 05, 2012 6:03 PM
To: [EMAIL PROTECTED]
Subject: Re: Hive uploading

Hi Yogesh

The --verbose option won't change the operation itself, but it prints more logging information to the console, which can be helpful for spotting hints.

So please post your console dump/log along with the sqoop import command, run with verbose enabled.
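
For reference, the flag is simply appended to the normal import invocation; a minimal sketch reusing this thread's connection details:

    sqoop import --connect jdbc:mysql://localhost:3306/Demo \
        --username sqoop1 -P --table Dummy --verbose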

Regards
Bejoy KS

________________________________
From: "[EMAIL PROTECTED]" <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED]; [EMAIL PROTECTED]
Sent: Thursday, July 5, 2012 6:00 PM
Subject: RE: Hive uploading

Hello Bejoy,

sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table Dummy --hive-table dummyhive  --create-hive-table  --hive-import --hive-home HADOOP/hive --verbose

Still the same: no table has been created. I am not able to see the dummyhive table in Hive using the command
SHOW TABLES;

although the dummyhive table's directory was created in HDFS at:
/user/hive/warehouse/dummyhive
Please suggest.
Yogesh Kumar
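
One thing worth checking for this symptom: with Hive's default embedded Derby metastore, the metastore_db directory is created in whatever working directory the client was started from, so a Hive shell launched from a different directory than the Sqoop job can end up reading a different (empty) metastore even though the warehouse files exist. A minimal check, assuming that default Derby setup:

    # Run from the directory where the sqoop import was invoked:
    ls -d metastore_db                              # embedded Derby metastore written by the import
    hive -e 'SHOW TABLES;'                          # should list dummyhive if this is the cause
    hadoop fs -ls /user/hive/warehouse/dummyhive    # the data files Sqoop loaded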

________________________________
From: Bejoy Ks [[EMAIL PROTECTED]]
Sent: Thursday, July 05, 2012 5:29 PM
To: [EMAIL PROTECTED]
Subject: Re: Hive uploading

Hi Yogesh

Please try out this command

 sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table Dummy --hive-table dummyhive  --create-hive-table  --hive-import --hive-home HADOOP/hive --verbose
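
As the import log later in this thread warns, passing --password on the command line is insecure; -P prompts for the password interactively instead. The same command with that one change (a sketch, otherwise identical):

    sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 -P \
        --table Dummy --hive-table dummyhive --create-hive-table --hive-import \
        --hive-home HADOOP/hive --verbose
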
Regards
Bejoy KS

________________________________
From: "[EMAIL PROTECTED]" <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED]; [EMAIL PROTECTED]
Sent: Thursday, July 5, 2012 5:03 PM
Subject: RE: Hive uploading

Hi Bejoy

I have confirmed the Hive installation; it is the same for both. I ran the command echo $HIVE_HOME in both the Sqoop terminal and the Hive terminal, and both print the same path:
HADOOP/hive
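
Since $HIVE_HOME matches in both shells, the next thing worth comparing is the configuration each client actually loads, because the metastore and warehouse locations are set there. A minimal sketch, assuming the usual conf layout under $HIVE_HOME:

    ls -l $HIVE_HOME/conf/hive-site.xml
    # Warehouse location (Hive defaults to /user/hive/warehouse if unset):
    grep -A 2 'hive.metastore.warehouse.dir' $HIVE_HOME/conf/hive-site.xml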

I am new to Hive and Sqoop; would you please give an example using the --verbose option with this command?
 sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --pass