Sqoop Incremental import fails
Hello All,

I have used this Sqoop incremental import, as the data volume is huge:

sqoop import -libjars ${SQOOP_SYBASE_JAR} \
--driver com.sybase.jdbc3.jdbc.SybDriver \
--connect ${CONNECTION_STRING} \
--query "select * from ${EMP} where \$CONDITIONS and SAL > ${AA} and SAL <= ${BB}" \
--check-column feed_key \
--incremental append \
--last-value ${AA} \
--split-by DEPT \
--fields-terminated-by ',' \
--target-dir ${TARGET_DIR}/${INC} \
--username ${SYBASE_USERNAME} \
--password ${SYBASE_PASSWORD}
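
(As a side note: with --incremental append, Sqoop imports only rows whose feed_key exceeds --last-value and logs the new maximum to supply on the next run. Sqoop can also track the last value automatically via a saved job. A minimal sketch, assuming the same variables as above; the job name emp_incremental is made up, and the -libjars and queue options are omitted for brevity:)

# Create a saved incremental job; Sqoop stores and updates --last-value
# in its metastore after each successful run.
sqoop job --create emp_incremental -- import \
--driver com.sybase.jdbc3.jdbc.SybDriver \
--connect ${CONNECTION_STRING} \
--query "select * from ${EMP} where \$CONDITIONS" \
--check-column feed_key \
--incremental append \
--last-value 0 \
--split-by DEPT \
--target-dir ${TARGET_DIR}/${INC} \
--username ${SYBASE_USERNAME} --password ${SYBASE_PASSWORD}

# Each execution imports only rows newer than the stored last value.
sqoop job --exec emp_incremental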

and the error I got is:

sqoop/01224054700null/_temporary/_attempt_201311150154_20546_m_000000_0/part-m-00000 (Disk quota

I faced the same issue during the first import (loading the last 19 months of data), but there I got rid of it easily by putting the import into a loop and changing the target directory on every iteration:
for i in `seq 0 18`
do
sqoop import -libjars ${SQOOP_SYBASE_JAR} \
-D mapred.job.queue.name=${HADOOP_JOB_QUEUE} --verbose \
--driver com.sybase.jdbc3.jdbc.SybDriver \
--connect ${CONNECTION_STRING} \
--query "select * from ${EMP} where \$CONDITIONS and SAL > ${AA} and SAL <= ${BB}" \
--split-by cycle_cd \
--fields-terminated-by ',' \
--target-dir ${TARGET_DIR}/${APAC_MONTHLY_HIST}/${i} \
--username ${SYBASE_USERNAME} \
--password ${SYBASE_PASSWORD}
done
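
(Since the failure message mentions a disk quota, checking the quota on the HDFS target directory may help. A minimal sketch, assuming the quota being hit is an HDFS space quota rather than a local-disk quota on a task node:)

# Show quota and usage for the target directory. Output columns:
# QUOTA REM_QUOTA SPACE_QUOTA REM_SPACE_QUOTA DIR_COUNT FILE_COUNT CONTENT_SIZE PATHNAME
hadoop fs -count -q ${TARGET_DIR}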

Please help and suggest.

Thanks
Yogesh
Jarek Jarcec Cecho 2014-01-06, 09:33