Sqoop dev mailing list: how to modify or use sqoop to write to a different destination


Re: how to modify or use sqoop to write to a different destination
Thanks. Importing into HDFS is fine for now, but I have another question.

Let's say I have three servers:

W: web server
H: Hadoop server
D: database server

What I want to do is use Sqoop on W to import data from D into H.
Unfortunately, H is locked down (no new software besides Hadoop may be
installed on it for now). Is this scenario possible with Sqoop? From reading
the documentation, it seems Sqoop has to be installed on H and run from H,
but H is a cluster where modifications are restricted.

Please note that I am experimenting with Sqoop 1.4.3.
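
For reference, Sqoop 1.x runs entirely on the machine it is launched from and
submits MapReduce jobs to the cluster, so it only needs the Hadoop client
libraries and configuration on W; nothing extra has to be installed on H. A
minimal sketch, assuming Sqoop 1.4.3 on W with HADOOP_CONF_DIR pointing at
H's NameNode and JobTracker (all hostnames, database, and table names below
are hypothetical):

    # Hadoop client configs on W that point at cluster H
    export HADOOP_CONF_DIR=/etc/hadoop/conf

    # launched on W; reads from database server D, writes into H's HDFS
    sqoop import \
      --connect jdbc:mysql://D.example.com:3306/salesdb \
      --username sqoopuser -P \
      --table orders \
      --target-dir /user/jane/orders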

On Wed, Jun 5, 2013 at 5:30 PM, Jarek Jarcec Cecho <[EMAIL PROTECTED]> wrote:

> Hi Jane,
> Sqoop currently supports import into HDFS, Hive, and HBase. One possible
> workaround for importing into a different system would be to import the
> data into HDFS first and then export it from there to the other store.
>
> Jarcec
>
> On Wed, Jun 05, 2013 at 04:28:49PM -0400, Jane Wayne wrote:
> > Hi,
> >
> > As I understand it, when Sqoop imports from an RDBMS, it can import
> > directly into HDFS or Hive. However, I would like to import into a
> > different destination (perhaps a different NoSQL store). How can I do
> > this? Is there a "hook" somewhere in the API?
> >
> > Thanks,
>
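
A rough sketch of the two-step workaround Jarcec describes, assuming the
target NoSQL store has its own bulk-load tool (the loader command and all
hostnames, database, and table names here are hypothetical):

    # step 1: land the table in HDFS as plain text files
    sqoop import \
      --connect jdbc:mysql://D.example.com:3306/salesdb \
      --username sqoopuser -P \
      --table orders \
      --target-dir /user/jane/orders \
      --as-textfile

    # step 2: stream the HDFS files into the other store's own loader
    hadoop fs -cat /user/jane/orders/part-* | nosql-bulk-loader --dest orders

For the built-in targets mentioned above, no extra step is needed: Sqoop 1.4.3
already accepts --hive-import for Hive and --hbase-table for HBase on the
import command line.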