Sqoop Halts - Exporting from HDFS to Oracle (Date/Timestamp issue?)
I'm encountering an issue when exporting data from HDFS to Oracle. I'm not sure whether this is a known issue or not. When I carefully checked the log, I saw that one of the columns has "2/13/2013 4:35:50", whereas the corresponding column type in Oracle is varchar2.

Can't we export date or time values simply as strings? When my HDFS data contains a value like that, Sqoop halts, and in the end its task is killed by the MapReduce task timeout.
Based on this: http://qnalist.com/questions/31561/sqoop-modifies-the-date-format-in-the-exported-data, I'm getting the impression that the Oracle driver converts the value to a timestamp and tries to insert it, and when the column type isn't a timestamp, it halts. Please correct me if I'm wrong.
Is there any way to skip that kind of conversion and just export the values as strings?
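Would an override like --map-column-java help here? A minimal sketch of what I have in mind, with hypothetical connect string, table (MY_TABLE), column (EVENT_TIME), and export directory, and assuming the Sqoop version honors the override on export:

    # Map EVENT_TIME to java.lang.String so the value is inserted as-is
    # into the varchar2 column instead of being parsed as a Timestamp.
    sqoop export \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username SCOTT --password tiger \
      --table MY_TABLE \
      --export-dir /user/tanzir/export_data \
      --map-column-java EVENT_TIME=String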
Thanks in advance.

Sincerely,
Tanzir