Erik Selin 2013-08-15, 21:36
Venkat Ranganathan 2013-08-15, 23:59
Jarek Jarcec Cecho 2013-08-18, 02:13
Erik Selin 2013-08-16, 05:43
Please file a JIRA and submit a patch. My suggestion was to keep the data
files in JSON by creating them as Hive tables with a JSON SerDe and using
Sqoop to import into those tables, but that may not be an option in all cases.
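[A rough sketch of the workaround Venkat describes, with assumed table and
connection names; the particular SerDe class and the HCatalog import route
are my guesses at how the pieces would fit, not a tested recipe:]

```shell
# 1. Create a Hive table whose rows are stored through a JSON SerDe
#    (class name assumed; any JSON SerDe on the Hive classpath would do):
hive -e "
  CREATE TABLE users_json (id INT, name STRING)
  ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe';"

# 2. Import through HCatalog so rows are written via the table's SerDe
#    (connection string and table names are placeholders):
sqoop import \
  --connect jdbc:mysql://db.example.com/app \
  --table users \
  --hcatalog-table users_json
```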
On Thu, Aug 15, 2013 at 10:43 PM, Erik Selin <[EMAIL PROTECTED]> wrote:
> Hey Venkat, Devs,
> My use case for json output is that the environment I work in is
> standardizing on keeping all data sources as json. Adding an extra pig
> script or something similar to convert all sqoop output to json is just
> extra work that, in my opinion, should simply be handled by sqoop. Sqoop
> seems especially suited for this since it is already aware of column names
> and data types, which makes the creation of solid json straightforward.
> Let me know if there's any other questions :)
> On 2013-08-15, at 17:36 , Erik Selin <[EMAIL PROTECTED]> wrote:
> > Hey devs,
> > I needed my data in json in hdfs so I hacked support for it into sqoop.
> Is this something you would be interested in adding to the main sqoop code?
> > repo: https://github.com/tyro89/json-sqoop
> > Note that I also added "public" to JAVA_RESERVED_WORDS.
> > Cheers,
> > Erik
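[To illustrate the point Erik makes above: since Sqoop already knows every
column's name and type, emitting one JSON object per row is essentially
zipping names with values. This is a minimal sketch of that idea in Python,
with a made-up schema, not code from the linked patch:]

```python
import json
from datetime import date

# Assumed schema for illustration: (column name, column type) pairs,
# the kind of metadata Sqoop already has for an imported table.
COLUMNS = [("id", int), ("name", str), ("created", date)]

def row_to_json(values):
    """Serialize one row to a JSON line, coercing dates to ISO strings."""
    record = {}
    for (name, _type), value in zip(COLUMNS, values):
        record[name] = value.isoformat() if isinstance(value, date) else value
    return json.dumps(record, sort_keys=True)

print(row_to_json([1, "erik", date(2013, 8, 15)]))
# -> {"created": "2013-08-15", "id": 1, "name": "erik"}
```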