Sqoop >> mail # user >> Escaping default hive delimiters on sqoop import


Escaping default hive delimiters on sqoop import
Hi All,

I want to know if there is any way to escape the default Hive delimiters (row delimiters: \n or \r, column delimiter: \001) during a Sqoop import.
Some columns in the Oracle DB contain these characters as part of string fields.
I want neither to delete them using --hive-drop-import-delims nor to replace them using --hive-delims-replacement.
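For context, this is roughly how the two options mentioned above are used. This is a hedged sketch: the JDBC connection string, username, and table name are hypothetical placeholders, not details from this post.

```shell
# Hypothetical Sqoop-to-Hive import; connection details and table name
# are placeholders, not from the original post.

# Option 1: strip \n, \r, and \001 from string fields (the data is altered):
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --table SOME_TABLE \
  --hive-import \
  --hive-drop-import-delims

# Option 2: replace those characters with a chosen string instead
# (here a single space; the data is still altered, just differently):
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --table SOME_TABLE \
  --hive-import \
  --hive-delims-replacement ' '
```

Both options modify the field contents on the way in, which is exactly what this question is trying to avoid by asking for a true escape mechanism.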

Please let me know if there is a way to escape them.

My current Sqoop version is 1.4.3.

Regards
-----------------------
Vikash T
+1 (408)506 2024
