Re: Zero rows imported while doing Mysql to Hive import
Hi,

While looking into the Hive history file, I found this query:

LOAD DATA INPATH 'hdfs://localhost:9000/user/root/Customers' INTO
TABLE `Customers`"
QUERY_ID="root_20130703050909_882c2484-e1c8-43a3-9eff-dd0f296fc560"
.....

The HDFS location mentioned in this query is a directory, not a CSV file.
The directory contains the part-* file(s) which hold the actual data.
Does Sqoop understand this directory structure and know how to read
those multiple part-* files, or is this an issue?
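
As far as I understand, LOAD DATA INPATH on a directory should simply move
every file under that directory into the table's warehouse location, so the
part-* layout itself should be fine. To double-check what the files contain,
a directory produced by a plain Sqoop import to HDFS can be inspected with
something like this (the path and part file name below are just examples):

  # list the files Sqoop wrote (path/part name are examples)
  hadoop fs -ls hdfs://localhost:9000/user/root/Customers
  # peek at the first few records of one part file
  hadoop fs -cat hdfs://localhost:9000/user/root/Customers/part-m-00000 | head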

I was hit by a similar thing while creating an external table in Hive,
where the specified location was such an HDFS directory (generated by
Sqoop import) containing multiple part-* files. The Hive table got
created, but all the rows were NULL. That's why I started looking into
the --hive-import option available in Sqoop, but it looks like that is
not working for me either.
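
For reference, the external table I tried looked roughly like the sketch
below (column names and types are placeholders). One thing I still want to
double-check there is the field delimiter: Hive's default field delimiter is
\001, while Sqoop writes comma-separated text by default, so here I spell it
out explicitly:

  # placeholder columns; delimiter set explicitly to match Sqoop's comma-separated output
  hive -e "CREATE EXTERNAL TABLE customers_ext (id INT, name STRING)
           ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
           LOCATION 'hdfs://localhost:9000/user/root/Customers';"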

Am I missing something?
Thanks,
Siddharth

On Wed, Jul 3, 2013 at 4:55 PM, Siddharth Karandikar
<[EMAIL PROTECTED]> wrote:
> Hi,
>
> I am facing some problems while importing a sample database from MySQL
> to Hive using Sqoop 1.4.3, Hive 0.11.0 and Hadoop 1.1.2 on a single
> node setup.
>
> While doing this, I always see the following message in the job logs -
> Table default.customers stats: [num_partitions: 0, num_files: 2,
> num_rows: 0, total_size: 15556, raw_data_size: 0]
>
> Job ends with success message -
> 13/07/03 05:09:30 INFO hive.HiveImport: Time taken: 0.74 seconds
> 13/07/03 05:09:30 INFO hive.HiveImport: Hive import complete.
> 13/07/03 05:09:30 INFO hive.HiveImport: Export directory is empty, removing it.
>
> Full command and log can be found at - http://pastebin.com/03f6Wdga
>
> I am using Sqoop for the first time, so I could be missing a few things.
> Any pointers to solve this problem would really help.
>
>
> MySQL to HDFS is working fine though.
>
>
> Thanks,
> Siddharth
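
One more thing I plan to check on my side is whether any files actually end
up under the table's warehouse directory after the import, and whether a
simple count sees any rows (this assumes the default warehouse location
/user/hive/warehouse):

  # assumes the default warehouse location /user/hive/warehouse
  hadoop fs -ls /user/hive/warehouse/customers
  # does Hive itself see any rows?
  hive -e "SELECT COUNT(*) FROM customers;"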