Sqoop, mail # user - Sqoop export to Teradata
Re: Sqoop export to Teradata
Jarek Jarcec Cecho 2013-09-30, 17:09
Hi Dipesh,
By default, Sqoop generates insert statements that carry multiple rows, in the format:

  INSERT INTO table VALUES(), (), (), ...

This particular SQL extension is very common in the database world, but unfortunately it's not supported by Teradata. As a result, you have to turn it off by setting the property sqoop.export.records.per.statement to 1, for example:

  sqoop export -Dsqoop.export.records.per.statement=1 --connect ...
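For reference, applied to the export command from your original message below, the invocation would look roughly like this (note that the generic -D option has to come right after "sqoop export", before the tool-specific arguments):

  sqoop export -Dsqoop.export.records.per.statement=1 \
    --verbose \
    --driver com.teradata.jdbc.TeraDriver \
    --connect jdbc:teradata://hostname.domain.com/DATABASE=DW1_DAWS_WORK,TMODE=ANSI,LOGMECH=LDAP \
    --export-dir /user/myusername/TDETAILS_ADDR_CLT_DLY_sqoop \
    --table TDETAILS_ADDR_CLT_DLY_d \
    --input-fields-terminated-by '|' \
    --input-escaped-by '\\' \
    --username myusername \
    --num-mappers 8 \
    -P

With that property set, Sqoop issues a separate INSERT INTO table VALUES (...) statement for each row, which Teradata accepts.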

Jarcec

On Sat, Sep 28, 2013 at 10:13:00PM +0530, Dipesh Kumar Singh wrote:
> Hello Users,
>
> I am getting the following error while exporting the data from HDFS to
> Teradata.
>
> *Sqoop Command used:*
>
> sqoop export --verbose \
> --driver com.teradata.jdbc.TeraDriver \
> --connect jdbc:teradata://hostname.domain.com/DATABASE=DW1_DAWS_WORK,TMODE=ANSI,LOGMECH=LDAP \
> --export-dir /user/myusername/TDETAILS_ADDR_CLT_DLY_sqoop \
> --table TDETAILS_ADDR_CLT_DLY_d   \
> --input-fields-terminated-by '|' \
> --input-escaped-by '\\' \
> //--lines-terminated-by '\n' \
> --username myusername \
> --num-mappers 8 \
> -P
>
> However, I am able to run the Sqoop import successfully; its output is in
> /user/myusername/TDETAILS_ADDR_CLT_DLY_sqoop
>
> *ERROR snip / Stack trace on Sqoop export:*
>
> 13/09/27 09:20:37 INFO mapred.JobClient: Running job: job_201309191609_5631
> 13/09/27 09:20:38 INFO mapred.JobClient:  map 0% reduce 0%
> 13/09/27 09:20:45 INFO mapred.JobClient: Task Id : attempt_201309191609_5631_m_000000_0, Status : FAILED
> java.io.IOException: com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata Database] [TeraJDBC 13.00.00.07] [Error 3706] [SQLState 42000] Syntax error: expected something between ')' and ','.
>         at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:220)
>         at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
>         at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:531)
>         at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:78)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>         at org.apache.hadoop.mapred.Child$4.run(Child.ja
> 13/09/27 09:20:46 INFO mapred.JobClient: Task Id : attempt_201309191609_5631_m_000004_0, Status : FAILED
> java.io.IOException: com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata Database] [TeraJDBC 13.00.00.07] [Error 3706] [SQLState 42000] Syntax error: expected something between ')' and ','.
>         at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:220)
>         at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
>         at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:531)
>         at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:78)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>         at org.apache.hadoop.mapred.Child$4.run(Child.ja
> 13/09/27 09:20:46 INFO mapred.JobClient: Task Id : attempt_201309191609_5631_m_000005_0, Status : FAILED