Re: hive-import using avro
Hi Siddharth,
I'm not aware of anyone who is currently working on SQOOP-324. Please do not hesitate to pick that JIRA up if you are interested. Contributions are welcome!

Jarcec

On Fri, Jul 05, 2013 at 07:48:11PM +0530, Siddharth Karandikar wrote:
> Hi,
>
> While looking into a MySQL import error I was facing (which was actually
> my own mistake), I came across the following issue on JIRA -
> https://issues.apache.org/jira/browse/SQOOP-324
> It is about using Avro file(s) instead of plain-text file(s) as the
> intermediate format when doing Hive imports.
>
> I can think of a couple of scenarios where Avro might help (though I'm not
> entirely sure) - binary data in database columns, or data containing the
> delimiters used as field/line separators (see the sketch after this message
> for the delimiter case).
>
> Has anyone tried implementing this? Is there a particular case that is
> solved by using Avro as the intermediate format?
>
>
> Thanks,
> Siddharth
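
For illustration, here is a minimal sketch of the delimiter scenario mentioned above. It assumes the fastavro Python package and a hypothetical two-column record (the schema and field names are illustrative, not from the thread). The string value contains Hive's default ^A field delimiter plus an embedded newline: a delimited text rendering mangles the row, while an Avro round trip preserves it, because Avro stores values as length-prefixed binary rather than relying on separator characters.

    import io

    import fastavro  # assumption: the fastavro package is installed

    # Hypothetical two-column record; "id" and "note" are illustrative names.
    schema = {
        "type": "record",
        "name": "Row",
        "fields": [
            {"name": "id", "type": "int"},
            {"name": "note", "type": "string"},
        ],
    }
    # The string column holds Hive's default field delimiter (\x01, i.e. ^A)
    # and an embedded newline.
    rows = [{"id": 1, "note": "has a \x01 delimiter and a\nnewline"}]

    # Plain-text intermediate format: the embedded ^A and newline are
    # indistinguishable from real field/line separators.
    text = "\n".join("{}\x01{}".format(r["id"], r["note"]) for r in rows)
    print(len(text.splitlines()))              # 2 (one row became two "lines")
    print(text.splitlines()[0].count("\x01"))  # 2 (two separators where one was expected)

    # Avro intermediate format: values are length-prefixed binary, so the
    # delimiter characters ride along untouched.
    buf = io.BytesIO()
    fastavro.writer(buf, schema, rows)
    buf.seek(0)
    restored = list(fastavro.reader(buf))
    print(restored[0]["note"] == rows[0]["note"])  # True

This only demonstrates why a delimited text intermediate file is fragile; SQOOP-324 itself is about using Avro instead of plain text as the intermediate file for Sqoop's Hive imports.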