Sqoop user mailing list: Sqoop connection to Informix Database

Re: Sqoop connection to Informix Database
From: Jarek Jarcec Cecho
Hi James,
I'm expecting that this exception is being raised on the mapper side; would you mind sharing the entire map task log with us?

Also, would you mind sharing your Hadoop and Sqoop versions? The Sqoop command line that you're using might also be helpful.

Jarcec
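
For context, an import from Informix over generic JDBC (the kind of command line being asked about above) typically looks something like the sketch below; the host, port, server name, database, credentials and table are placeholders rather than values taken from this thread:

  # Informix's JDBC driver class is com.informix.jdbc.IfxDriver; passing --driver
  # explicitly makes Sqoop fall back to its generic JDBC connection manager.
  sqoop import \
    --connect 'jdbc:informix-sqli://dbhost:9088/mydb:INFORMIXSERVER=ol_informix' \
    --driver com.informix.jdbc.IfxDriver \
    --username informix -P \
    --table customer_orders \
    --split-by id \
    --num-mappers 4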

On Thu, Jan 17, 2013 at 11:00:16AM +0000, James Hogarth wrote:
> Hi all,
>
> I'm trying to integrate our Hadoop infrastructure with another team's
> Informix database...
>
> I have the JDBC drivers in place and test queries to most tables are fine,
> but we're hitting a couple of trip-ups...
>
> Some of the tables have grown in size and the primary key has been changed
> from serial to serial8 to cope with this.
>
> When a query is run against such a table a java exception is raised with
> the following stack trace:
>
> java.io.IOException: SQLException in nextKeyValue
> at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
> at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
> at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:396)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> at org.apache.hadoop.mapred.Child.main(Child.java:262)
>
>
> Has anyone encountered this or have any ideas how to work around it?
>
>
> Kind regards,
>
>
> James
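
This excerpt does not show how the thread was eventually resolved, but since the failures are limited to the tables whose key column moved from serial to serial8 (an 8-byte integer), one speculative workaround would be to re-run the import with that column explicitly mapped to a 64-bit Java type. The command below is only an illustrative sketch under that assumption; the table, column and connection details are placeholders, not values from this thread:

  # Hypothetical workaround: force the serial8 key column to java.lang.Long in
  # Sqoop's generated record class rather than relying on the driver's reported type.
  sqoop import \
    --connect 'jdbc:informix-sqli://dbhost:9088/mydb:INFORMIXSERVER=ol_informix' \
    --driver com.informix.jdbc.IfxDriver \
    --username informix -P \
    --table big_table \
    --split-by id \
    --map-column-java id=Long \
    --target-dir /user/james/big_table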

Other messages in this thread:
James Hogarth 2013-01-17, 13:02
James Hogarth 2013-01-17, 16:48
Jarek Jarcec Cecho 2013-01-22, 01:03
James Hogarth 2013-01-28, 09:53