Sqoop user mailing list: Sqoop export to Teradata


Dipesh Kumar Singh 2013-09-28, 16:43
Jarek Jarcec Cecho 2013-09-30, 17:09
Re: Sqoop export to Teradata
Thanks a ton, Jarek.

I was able to fix that and run the command, but it kills all the
efficiency Sqoop provides for bulk loads.

sqoop export -Dsqoop.export.records.per.statement=1 \
--verbose \
--driver com.teradata.jdbc.TeraDriver \
--connect jdbc:teradata://hostname.com/DATABASE=DW1_DAWS_WORK,TMODE=ANSI,LOGMECH=LDAP \
--export-dir /user/dipeshks/T_DLY_sqoop \
--table T_DLY_d \
--input-fields-terminated-by '|' \
--username myuser \
--num-mappers 1 \
--batch \
-P
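(Editorial aside on the efficiency point: `sqoop.export.records.per.statement` controls how many rows are packed into one INSERT statement, so setting it to 1 means one statement per exported record. A rough illustrative sketch, using sqlite3 as a stand-in database and a made-up table `t`; Teradata itself is what rejects the multi-row form:)

```python
# Sketch of why records.per.statement matters: count how many INSERT
# statements are issued for the same 1000 rows. sqlite3 stands in for
# the target database; table "t" and the rows are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, v TEXT)")
rows = [(i, str(i)) for i in range(1000)]

# records.per.statement=1: one INSERT per row -> 1000 statements.
per_row_statements = 0
for r in rows:
    conn.execute("INSERT INTO t VALUES (?, ?)", r)
    per_row_statements += 1

# Sqoop's default: many rows packed into one multi-row VALUES statement
# (the SQL extension Teradata rejects) -> far fewer statements.
conn.execute("DELETE FROM t")
multi_row_statements = 0
chunk_size = 100
for i in range(0, len(rows), chunk_size):
    chunk = rows[i:i + chunk_size]
    placeholders = ", ".join("(?, ?)" for _ in chunk)
    params = [x for r in chunk for x in r]
    conn.execute(f"INSERT INTO t VALUES {placeholders}", params)
    multi_row_statements += 1

print(per_row_statements, multi_row_statements)  # 1000 vs 10
```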

I wanted to know a few things:

1> Is there some way I can plug custom code into Sqoop and use Teradata
FastLoad capabilities? It would help if anyone could direct me to some
references to start with.
2> I would like some explanation of why increasing the number of mappers
causes this deadlock error:

java.io.IOException: com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata Database] [TeraJDBC 13.00.00.07] [Error 2631] [SQLState 40001] Transaction ABORTed due to deadlock.
        at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:220)
        at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:531)
        at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:78)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:266)
        at java.
Thanks & Regards,
Dipesh
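(Editorial aside: the difference between Sqoop's default output and `records.per.statement=1`, as Jarcec describes in the quoted reply, is just the shape of the generated INSERT. A hypothetical sketch; `build_insert` is an illustration of the two statement shapes, not Sqoop's actual code generator:)

```python
# Hypothetical sketch of the two INSERT shapes; build_insert is a
# stand-in for illustration only, not Sqoop's generator.
def build_insert(table, rows, records_per_statement):
    """Group rows into INSERT statements, each carrying up to
    records_per_statement row tuples in its VALUES list."""
    stmts = []
    for i in range(0, len(rows), records_per_statement):
        chunk = rows[i:i + records_per_statement]
        values = ", ".join(
            "(" + ", ".join(repr(v) for v in row) + ")" for row in chunk
        )
        stmts.append(f"INSERT INTO {table} VALUES {values}")
    return stmts

rows = [(1, "a"), (2, "b"), (3, "c")]

# Default-style multi-row statement -- the form Teradata rejects
# (the "expected something between ')' and ','" error 3706):
multi = build_insert("T", rows, 100)

# With -Dsqoop.export.records.per.statement=1: one row per statement.
single = build_insert("T", rows, 1)

print(multi[0])   # INSERT INTO T VALUES (1, 'a'), (2, 'b'), (3, 'c')
print(single[0])  # INSERT INTO T VALUES (1, 'a')
```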
On Mon, Sep 30, 2013 at 10:39 PM, Jarek Jarcec Cecho <[EMAIL PROTECTED]> wrote:

> Hi Dipesh,
> Sqoop by default will generate insert statements that have multiple rows
> in format:
>
>   INSERT INTO table VALUES(), (), (), ...
>
> This particular SQL extension is very common in the database world, but
> it is unfortunately not supported by Teradata. As a result, you have to
> turn it off by setting the property sqoop.export.records.per.statement
> to 1, for example:
>
>   sqoop export -Dsqoop.export.records.per.statement=1 --connect ...
>
> Jarcec
>
> On Sat, Sep 28, 2013 at 10:13:00PM +0530, Dipesh Kumar Singh wrote:
> > Hello Users,
> >
> > I am getting the following error while exporting the data from HDFS to
> > Teradata.
> >
> > Sqoop command used:
> >
> > sqoop export --verbose \
> > --driver com.teradata.jdbc.TeraDriver \
> > --connect jdbc:teradata://hostname.domain.com/DATABASE=DW1_DAWS_WORK,TMODE=ANSI,LOGMECH=LDAP \
> > --export-dir /user/myusername/TDETAILS_ADDR_CLT_DLY_sqoop \
> > --table TDETAILS_ADDR_CLT_DLY_d \
> > --input-fields-terminated-by '|' \
> > --input-escaped-by '\\' \
> > //--lines-terminated-by '\n' \
> > --username myusername \
> > --num-mappers 8 \
> > -P
> >
> > Though, I am able to run the sqoop import successfully. Its output is in
> > /user/myusername/TDETAILS_ADDR_CLT_DLY_sqoop
> >
> > ERROR snip / stack trace from the Sqoop export:
> >
> > 13/09/27 09:20:37 INFO mapred.JobClient: Running job: job_201309191609_5631
> > 13/09/27 09:20:38 INFO mapred.JobClient:  map 0% reduce 0%
> > 13/09/27 09:20:45 INFO mapred.JobClient: Task Id : attempt_201309191609_5631_m_000000_0, Status : FAILED
> > java.io.IOException: com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata Database] [TeraJDBC 13.00.00.07] [Error 3706] [SQLState 42000] Syntax error: expected something between ')' and ','.
> >         at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:220)
> >         at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
> >         at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:531)

Dipesh Kr. Singh
Venkat Ranganathan 2013-10-10, 04:02