But how do I achieve dynamic partitioning? For each row in MySQL, the date from a column should determine the partition name, and the row should be inserted into the corresponding partition in Hive. Sqoop requires the partition to be specified beforehand. On Fri, May 2, 2014 at 8:36 AM, unmesha sreeveni <[EMAIL PROTECTED]> wrote:
Sqoop also supports dynamic partitioning; I have done that. For that you have to enable dynamic partitioning in Hive, i.e., set hive.exec.dynamic.partition=true. On Fri, May 2, 2014 at 12:57 PM, unmesha sreeveni <[EMAIL PROTECTED]> wrote:
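For reference, enabling dynamic partitioning is a per-session Hive setting. A minimal sketch (the two property names are standard Hive configuration; whether you also need nonstrict mode depends on whether any partition column is static):

```sql
-- Enable dynamic partitioning for the session.
SET hive.exec.dynamic.partition = true;
-- 'nonstrict' allows ALL partition columns to be dynamic;
-- the default 'strict' mode requires at least one static partition.
SET hive.exec.dynamic.partition.mode = nonstrict;
```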
It sounds like you might need to export via Sqoop using a query or view, since the date granularity in your MySQL table differs from that of the desired Hive table. Overall performance may be lower, because MySQL must do more than just read rows from disk, but you may still find ways to pull the data in parallel through Sqoop.
For that, do I need to load the files into a non-partitioned table first, and then insert from the unpartitioned table into the partitioned one? On Fri, May 2, 2014 at 4:04 PM, Hamza Asad <[EMAIL PROTECTED]> wrote:
I am new to Hive, but here is my idea: 1. Use mysqldump to dump your data to a CSV file. 2. Load the CSV into a Hive temp table. 3. Create the partitioned table. 4. With dynamic partitioning enabled, select from the temp table to insert into the partitioned table. You can use a UDF to get the date from the timestamp.
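Steps 2–4 above could be sketched in HiveQL roughly as follows. All table names, column names, and the file path are hypothetical; the built-in to_date() function plays the role of the UDF that derives the partition value from the timestamp:

```sql
-- 2. Load the CSV dump into a non-partitioned staging table.
CREATE TABLE events_staging (id INT, payload STRING, event_ts TIMESTAMP)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';
LOAD DATA LOCAL INPATH '/tmp/events.csv' INTO TABLE events_staging;

-- 3. Create the partitioned target table.
CREATE TABLE events (id INT, payload STRING, event_ts TIMESTAMP)
PARTITIONED BY (event_date STRING);

-- 4. Insert with dynamic partitioning; the partition column must
--    come last in the SELECT list, and its value is derived from
--    the timestamp via to_date().
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;
INSERT OVERWRITE TABLE events PARTITION (event_date)
SELECT id, payload, event_ts, to_date(event_ts) AS event_date
FROM events_staging;
```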