Sqoop >> mail # user >> hive-import using avro


hive-import using avro
Hi,

While looking into a MySQL import error that I was facing (which was actually
my own mistake), I came across the following issue on JIRA:
https://issues.apache.org/jira/browse/SQOOP-324
It is about using Avro file(s) instead of plain-text file(s) as the
intermediate format when doing Hive imports.
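For context, a plain-text Hive import today looks roughly like the sketch below; what SQOOP-324 asks for is being able to combine --hive-import with --as-avrodatafile. The connection string, user, and table names here are made up for illustration:

```shell
# Plain-text Hive import: Sqoop writes delimited text files to HDFS,
# then creates/loads a Hive table over them (details are placeholders).
sqoop import \
  --connect jdbc:mysql://dbhost/mydb \
  --username myuser --table mytable \
  --hive-import

# What SQOOP-324 requests: Avro data files as the intermediate format
# instead of delimited text. At the time of the JIRA this combination
# was not yet supported.
sqoop import \
  --connect jdbc:mysql://dbhost/mydb \
  --username myuser --table mytable \
  --as-avrodatafile \
  --hive-import
```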

I can think of a couple of scenarios where Avro might help (though I'm not
very sure): binary data in database columns, data containing the characters
used as field/line delimiters, etc.
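To illustrate the delimiter scenario: with plain-text intermediate files, a value that happens to contain the field separator corrupts the row when it is re-parsed, while a schema-based format like Avro keeps field boundaries out of band. A small stand-alone sketch of the failure mode (not Sqoop code; Python's json stands in for Avro here):

```python
import json

# A row whose last column contains the field separator itself.
row = ["1", "O'Brien", "likes,commas"]

# Plain-text intermediate file: join with ',' then split on ',' --
# effectively what a text-based import does with default delimiters.
line = ",".join(row)
reparsed = line.split(",")
print(reparsed)                   # the embedded comma splits the value: 4 fields
print(len(reparsed) == len(row))  # False -- row structure is corrupted

# A structured, schema-based format records field boundaries separately,
# so the value round-trips intact (json used as a stand-in for Avro).
record = json.loads(json.dumps(row))
print(record == row)              # True
```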

Has anyone tried implementing this? Any particular case that is solved by
using avro as intermediate format?
Thanks,
Siddharth
Jarek Jarcec Cecho 2013-07-08, 15:28