Sqoop export to Teradata (Sqoop user mailing list)


Dipesh Kumar Singh 2013-09-28, 16:43
Jarek Jarcec Cecho 2013-09-30, 17:09
Dipesh Kumar Singh 2013-10-10, 03:41
Re: Sqoop export to Teradata
Hortonworks (and also Cloudera) provide a Sqoop connector that is based on
the Teradata Hadoop connector and optimizes the data transfer. The connector
provides FastLoad as well as batch updates for exporting data to Teradata,
and a few options for importing data.

Please check the respective vendor websites for more information.

Thanks

Venkat
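The batch-export behaviour discussed in this thread can be illustrated with a short, self-contained sketch. This is plain Python for illustration only, not Sqoop's actual statement generator: with several records per statement Sqoop emits one multi-row INSERT (the form Teradata rejects), while setting the records-per-statement property to 1 emits one single-row INSERT per record.

```python
# Illustrative sketch (not Sqoop's real code): how the number of records
# per INSERT statement changes the SQL that gets sent to the database.

def build_inserts(table, rows, records_per_statement):
    """Group rows into INSERT statements with the given number of rows each."""
    statements = []
    for i in range(0, len(rows), records_per_statement):
        chunk = rows[i:i + records_per_statement]
        values = ", ".join(
            "(" + ", ".join(repr(v) for v in row) + ")" for row in chunk
        )
        statements.append(f"INSERT INTO {table} VALUES {values}")
    return statements

rows = [(1, "a"), (2, "b"), (3, "c")]

# Multi-row form -- common in other databases, rejected by Teradata:
print(build_inserts("T_DLY_d", rows, records_per_statement=100))

# Single-row form, as forced by -Dsqoop.export.records.per.statement=1:
print(build_inserts("T_DLY_d", rows, records_per_statement=1))
```

The table name `T_DLY_d` is taken from the command in the thread; the function itself is a hypothetical stand-in for the connector's statement batching.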
On Wed, Oct 9, 2013 at 8:41 PM, Dipesh Kumar Singh <[EMAIL PROTECTED]> wrote:

> Thanks a ton Jarek.
>
> I am able to rectify that and run the command but it kills all the
> efficiency sqoop provides for bulk load.
>
> sqoop export -Dsqoop.export.records.per.statement=1 \
> --verbose \
> --driver com.teradata.jdbc.TeraDriver \
> --connect jdbc:teradata://
> hostname.com/DATABASE=DW1_DAWS_WORK,TMODE=ANSI,LOGMECH=LDAP \
> --export-dir /user/dipeshks/T_DLY_sqoop \
> --table T_DLY_d   \
> --input-fields-terminated-by '|' \
> --username myuser \
> --num-mappers 1 \
> --batch \
> -P
>
> I wanted to know a few things:
>
> 1> Is there some way I can plug some custom code into Sqoop and use
> Teradata FastLoad capabilities? It would help if anyone could direct me to
> some references to start with.
>
>
> 2> I would like some explanation of why increasing the number of mappers
> causes this deadlock error:
>
> java.io.IOException: com.teradata.jdbc.jdbc_4.util.JDBCException:
> [Teradata Database] [TeraJDBC 13.00.00.07] [Error 2631] [SQLState 40001]
> Transaction ABORTed due to deadlock.
>     at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:220)
>     at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
>     at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:531)
>     at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>     at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:78)
>     at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:266)
>     at java.
>
>
> Thanks & Regards,
> Dipesh
>
>
> On Mon, Sep 30, 2013 at 10:39 PM, Jarek Jarcec Cecho <[EMAIL PROTECTED]> wrote:
>
>> Hi Dipesh,
>> Sqoop by default will generate insert statements that have multiple rows
>> in format:
>>
>>   INSERT INTO table VALUES(), (), (), ...
>>
>> This particular SQL extension is very common in the database world, but
>> it's unfortunately not supported by Teradata. As a result you have to turn
>> it off by setting the property sqoop.export.records.per.statement to 1,
>> for example:
>>
>>   sqoop export -Dsqoop.export.records.per.statement=1 --connect ...
>>
>> Jarcec
>>
>> On Sat, Sep 28, 2013 at 10:13:00PM +0530, Dipesh Kumar Singh wrote:
>> > Hello Users,
>> >
>> > I am getting the following error while exporting the data from HDFS to
>> > Teradata.
>> >
>> > Sqoop command used:
>> > sqoop export --verbose \
>> > --driver com.teradata.jdbc.TeraDriver \
>> > --connect jdbc:teradata://
>> > hostname.domain.com/DATABASE=DW1_DAWS_WORK,TMODE=ANSI,LOGMECH=LDAP \
>> > --export-dir /user/myusername/TDETAILS_ADDR_CLT_DLY_sqoop \
>> > --table TDETAILS_ADDR_CLT_DLY_d   \
>> > --input-fields-terminated-by '|' \
>> > --input-escaped-by '\\' \
>> > //--lines-terminated-by '\n' \
>> > --username myusername \
>> > --num-mappers 8 \
>> > -P
>> >
>> > I am, though, successfully able to do a Sqoop import; its output is in
>> > /user/myusername/TDETAILS_ADDR_CLT_DLY_sqoop
>> >
>> > Error snip / stack trace on Sqoop export:
>> > 13/09/27 09:20:37 INFO mapred.JobClient: Running job:
>> job_201309191609_5631
>> > 13/09/27 09:20:38 INFO mapred.JobClient:  map 0% reduce 0%
>> > 13/09/27 09
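
The thread leaves question 2 (why more mappers trigger Error 2631) unanswered, but the concurrency involved can be sketched. A Sqoop export with --num-mappers N runs N map tasks, each with its own JDBC session inserting into the same target table; whether concurrent sessions deadlock then depends on the database's locking behaviour. The split function below is a hypothetical illustration, not Sqoop's actual input-split logic:

```python
# Illustrative sketch: dividing the export directory's files among N map
# tasks. Each task is an independent writer against the same table, so
# N=8 means eight concurrent sessions, while N=1 means a single writer
# and no lock contention between tasks.

def assign_splits(files, num_mappers):
    """Round-robin the export-dir files across the map tasks."""
    tasks = [[] for _ in range(num_mappers)]
    for i, name in enumerate(files):
        tasks[i % num_mappers].append(name)
    return tasks

files = [f"part-m-{i:05d}" for i in range(8)]
print(assign_splits(files, 8))  # eight concurrent writers, as in the failing run
print(assign_splits(files, 1))  # one writer, as in the working --num-mappers 1 run
```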