Sqoop user mailing list: Using Sqoop incremental import as chunk


Felix GV 2013-05-08, 18:24
Re: Using Sqoop incremental import as chunk

You can use --boundary-query for this:

http://sqoop.apache.org/docs/1.4.3/SqoopUserGuide.html#_connecting_to_a_database_server

--
Felix
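
For concreteness, a minimal sketch of one way to apply this, assuming a
MySQL table mytable with an integer key column id (the table, user, JDBC
URL, and paths below are all hypothetical; --where is added here on top
of the --boundary-query suggestion so the chunk boundaries are explicit
in the row filter as well):

  # Import only the second chunk (ids 21-40); all names are made up.
  # --where filters the rows of one chunk, and --boundary-query keeps
  # Sqoop's split computation inside the same range.
  sqoop import \
    --connect jdbc:mysql://dbhost/mydb \
    --username myuser -P \
    --table mytable \
    --split-by id \
    --where "id > 20 AND id <= 40" \
    --boundary-query "SELECT 21, 40" \
    --target-dir /data/mytable/chunk2

Each subsequent chunk would shift the range by 20 and write to a fresh
--target-dir.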
On Wed, May 8, 2013 at 1:00 PM, Tanzir Musabbir <[EMAIL PROTECTED]> wrote:

>  Hello everyone,
>
> Is it really possible to import data chunk-wise through Sqoop
> incremental import?
>
> Say I have a table with ids 1, 2, 3, ..., N (here N is 100), and I
> want to import it in chunks, like:
> 1st import: 1, 2, 3, ..., 20
> 2nd import: 21, 22, 23, ..., 40
> last import: 81, 82, 83, ..., 100
>
> I have read about Sqoop jobs with incremental import and I know the
> --last-value parameter, but I do not know how to pass the chunk size.
> In the above example, the chunk size is 20.
>
> Any information will be highly appreciated. Thanks in advance.
>
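
For reference, the saved-job mechanism the question refers to looks
roughly like this (names again hypothetical). Sqoop records and updates
--last-value between runs of a saved job, but in Sqoop 1.4.3 there
appears to be no option that caps a single run at a fixed chunk size,
hence the --boundary-query suggestion above:

  # Create a saved job doing incremental append on the key column;
  # Sqoop's metastore remembers the last imported id between runs.
  sqoop job --create mytable-inc -- import \
    --connect jdbc:mysql://dbhost/mydb \
    --username myuser -P \
    --table mytable \
    --incremental append \
    --check-column id \
    --last-value 0

  # Each run imports every row with id greater than the stored last
  # value, however many rows that is.
  sqoop job --exec mytable-inc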

Other replies in this thread:
Jarek Jarcec Cecho 2013-05-08, 18:08
Tanzir Musabbir 2013-05-08, 18:17
Jarek Jarcec Cecho 2013-05-08, 18:23
Tanzir Musabbir 2013-05-09, 15:31