

sqoop mysql to hive's --split-by bug
Hi all,

My MySQL table is shown below; the id_sign column is bigint(20) unsigned.
When I use --split-by id_sign, the following exception is thrown. I looked
at the code carefully and found that the problem is here:
     long minVal = results.getLong(1);
     long maxVal = results.getLong(2);

I think that when the --split-by column is an unsigned bigint, Sqoop could
map it to a String (or another type wide enough for the full unsigned range)
instead of a signed long; a rough sketch of the idea follows. Should I submit
a patch for this?
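
Just to illustrate the idea (this is only a rough sketch, not a real patch
against Sqoop, and the class and method names below are made up for
illustration). I mentioned String above; BigDecimal is another type that is
wide enough, and it keeps the bounds numeric, so values above Long.MAX_VALUE,
such as 18385347166395843554 in the error below, no longer overflow:

     import java.math.BigDecimal;
     import java.sql.ResultSet;
     import java.sql.SQLException;

     // Illustration only: read the MIN/MAX split bounds as BigDecimal so that
     // BIGINT UNSIGNED values above Long.MAX_VALUE (9223372036854775807) do
     // not trigger "outside valid range for the datatype BIGINT" from the
     // MySQL JDBC driver.
     public class UnsignedSplitBounds {
         static BigDecimal[] readBounds(ResultSet results) throws SQLException {
             // ResultSet.getBigDecimal() covers the full 0 .. 2^64-1 range,
             // unlike the getLong() calls quoted above.
             BigDecimal minVal = results.getBigDecimal(1);
             BigDecimal maxVal = results.getBigDecimal(2);
             return new BigDecimal[] { minVal, maxVal };
         }
     }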

Thank you
+-----------------------+-----------------------+------+-----+---------+-------+
| Field                 | Type                  | Null | Key | Default | Extra |
+-----------------------+-----------------------+------+-----+---------+-------+
| id_sign               | bigint(20) unsigned   | NO   | PRI | NULL    |       |
+-----------------------+-----------------------+------+-----+---------+-------+
java.io.IOException: com.mysql.jdbc.exceptions.jdbc4.MySQLDataException: '18385347166395843554' in column '2' is outside valid range for the datatype BIGINT.

14/01/06 15:25:30 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: com.mysql.jdbc.exceptions.jdbc4.MySQLDataException: '18385347166395843554' in column '2' is outside valid range for the datatype BIGINT.
        at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:170)
        at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1024)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1041)
        at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
        at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:186)
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:159)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:239)
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:600)
        at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:413)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLDataException: '18385347166395843554' in column '2' is outside valid range for the datatype BIGINT.

--
[EMAIL PROTECTED]|Hu