RE: Sqoop import of packed data
Hi Jarcec,

Thanks for the info.

-Ganesh

-----Original Message-----
From: Jarek Jarcec Cecho [mailto:[EMAIL PROTECTED]]
Sent: Monday, October 14, 2013 11:08 AM
To: [EMAIL PROTECTED]
Subject: Re: Sqoop import of packed data

Hi Ganesh,
Sqoop uses the DB2 JDBC connector to talk to DB2 running on mainframes.
The data unpacking is actually done by the JDBC connector itself, as
otherwise the connector would not be compliant with the JDBC specification.
Sqoop takes advantage of that to transfer your data into HDFS already unpacked.

Jarcec
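
As a rough illustration of that point, here is a minimal sketch of what the driver does for packed columns, assuming the IBM Toolbox for Java (jt400) driver; the connection URL, credentials, library, table, and column names below are placeholders, not values from this thread. A plain JDBC read of a packed DECIMAL column already comes back as a java.math.BigDecimal:

    import java.math.BigDecimal;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class PackedDecimalReadSketch {
        public static void main(String[] args) throws Exception {
            // Assumed driver: IBM Toolbox for Java (jt400) for AS/400 (IBM i).
            Class.forName("com.ibm.as400.access.AS400JDBCDriver");

            // Placeholder host and library.
            String url = "jdbc:as400://as400.example.com/MYLIB";

            try (Connection conn = DriverManager.getConnection(url, "user", "password");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                     "SELECT ACCOUNT_ID, BALANCE FROM MYLIB.ACCOUNTS")) {

                while (rs.next()) {
                    // BALANCE is stored as packed decimal on the AS/400 side;
                    // the JDBC driver hands it back as an ordinary BigDecimal.
                    BigDecimal balance = rs.getBigDecimal("BALANCE");
                    System.out.println(rs.getString("ACCOUNT_ID") + " -> " + balance);
                }
            }
        }
    }

A Sqoop import goes through the same driver (supplied via --connect and, if needed, --driver), so the values land in HDFS already unpacked with no extra conversion step.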

On Thu, Oct 10, 2013 at 08:27:05AM -0400, Ganesh Kumar Narayanan wrote:
> Hi all,
>
> I am about to evaluate Sqoop import from AS/400 DB2 using the IBM JDBC Connector.
>
> I have a question about the capabilities of Sqoop import. Does Sqoop import handle packed data from AS/400? If it does, is it possible to get it unpacked during import?
>
> Thanks.
>
> -Ganesh