Sqoop user mailing list: Sqoop exporting error


Earlier messages in this thread:
  Matthieu Labour        2012-11-02, 23:18
  Jarek Jarcec Cecho     2012-11-03, 00:19
  Matthieu Labour        2012-11-06, 18:17
  Jarek Jarcec Cecho     2012-11-06, 19:33
  Matthieu Labour        2012-11-06, 20:30
  Matthieu Labour        2012-11-07, 18:10

Re: Sqoop exporting error
Hi Matthieu,
sorry for my late response.

Thank you for doing the research on what might help fix the issue. Sqoop uses prepared statements, so we're not explicitly escaping data values. I believe the quoted '1351716176767' is just how the PostgreSQL connector displays the failed statement, not the actual problem; you can see many more number-based columns quoted the same way in your example.
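(For illustration only, not Sqoop's actual code: a rough sketch of a JDBC prepared-statement batch insert. The class name, connection URL, and credentials are placeholders, and only two of the table's columns are shown. The point is that values are bound as typed parameters, so the quotes in the logged INSERT are only how the driver renders the failed statement.)

    import java.math.BigDecimal;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Timestamp;

    // Illustrative sketch only: parameters travel separately from the SQL
    // text, so no string escaping of the numeric value ever happens here.
    public class PreparedInsertSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder connection URL and credentials.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost:5432/testdb", "user", "secret")) {
                // Only two of the table's columns shown for brevity.
                String sql = "INSERT INTO ml_ys_log_gmt_test (date, ts) VALUES (?, ?)";
                try (PreparedStatement ps = conn.prepareStatement(sql)) {
                    ps.setTimestamp(1, Timestamp.valueOf("2012-10-31 20:42:56.767"));
                    ps.setBigDecimal(2, new BigDecimal("1351716176767")); // numeric column
                    ps.addBatch();
                    ps.executeBatch(); // values are sent as bound parameters
                }
            }
        }
    }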

Would you mind sharing the task log for this particular job? I would like to see the output of the getNextException() call that should be there.
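(As a minimal sketch, not part of Sqoop: this is how the chained exception can be read off a BatchUpdateException; the PostgreSQL driver attaches the real server-side error there, and that chained message is what should show up in the failed task's log.)

    import java.sql.BatchUpdateException;
    import java.sql.SQLException;

    // Minimal helper sketch: walks the exception chain attached to a failed
    // batch so the underlying PostgreSQL error message becomes visible.
    public class BatchErrorSketch {
        public static void printChain(BatchUpdateException e) {
            SQLException next = e.getNextException();
            while (next != null) {
                System.err.println("Chained cause: " + next.getMessage());
                next = next.getNextException();
            }
        }
    }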

Jarcec

On Wed, Nov 07, 2012 at 01:10:10PM -0500, Matthieu Labour wrote:
> Hi Jarcec
>
> I did the following
>
> I changed the type of the 'ts' column to decimal:
> ALTER TABLE ml_ys_log_gmt_test ALTER ts TYPE decimal;
>
>  Table "public.ml_ys_log_gmt_test"
>  Column | Type    | Modifiers | Storage | Description
> --------+---------+-----------+---------+-------------
>  ts     | numeric |           | main    |
>
>
> Then I ran the following command:
>
> ~/$SQOOP_ROOT/bin/sqoop export \
>   --connect jdbc:postgresql://ec2-XX-XX-XXX-XX.compute-1.amazonaws.com:5662/dfg2tulf7263ut \
>   --username ufjb0gfs1n5kut --password xxxxxx \
>   --table ml_ys_log_gmt_test \
>   --export-dir $HADOOP_INPUT/$LOGS_HOME/dt=$(date +%Y-%m-%d) \
>   --input-fields-terminated-by='\t' --lines-terminated-by='\n' \
>   --verbose --batch
>
>
> 12/11/07 17:25:32 INFO mapred.JobClient: Task Id :
> attempt_201211071722_0002_m_000000_0, Status : FAILED
> java.io.IOException: java.sql.BatchUpdateException: Batch entry 71 INSERT
> INTO ml_ys_log_gmt_test (date, ts, environment, resource, network,
> advertiser, campaign, creative, cost, click, flavor, ui_element_id,
> ui_element_type, event, charge_id, charge_type, charge_vertical,
> charge_payer, charge_amount, model, imageformatsupport, inputmodesupport,
> scriptsupport, vendor, stylesheetsupport, markupsupport, displaywidth,
> cookiesupport, displayheight, version, inputdevices, displaycolordepth,
> device_os, device_os_version, mobile_browser, mobile_browser_version,
> is_tablet, dual_orientation, marketing_name) VALUES ('2012-10-31
> 20:42:56.767000 +00:00:00', '1351716176767',
> 'remote-addr=10.84.101.45&user-agent=LG-CT810%2FV10x+NetFront%2Fv3.5+Profile%2FMIDP-2.0+Configuration%2FCLDC-1.1',
> 'hxCA', 'MLNL', '1006', '10014', '1410', NULL, 'mdkP', '10014', '', '',
> 'click', '', '', '', '', NULL, 'CT810', '', '', '', 'LG', '', '', '400',
> '', '240', '', 'stylus', '', 'Windows Mobile OS', '6.1', 'Microsoft Mobile
> Explorer', '7.11', 'false', 'false', '') was aborted.  Call
> getNextException to see the cause.
>         at
> org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:220)
>         at
> org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
>         at
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:646)
>         at
> org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>         at
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:78)
>         at
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:771)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:375)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1132)
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
Later messages in this thread:
  Matthieu Labour        2012-11-09, 18:26
  Jarek Jarcec Cecho     2012-11-09, 23:53