Sqoop, mail # user - Sqoop exporting error


Re: Sqoop exporting error
Jarek Jarcec Cecho 2012-11-09, 23:53
You're welcome. I'm glad that Sqoop has started working for you!

Jarcec

On Fri, Nov 09, 2012 at 01:26:56PM -0500, Matthieu Labour wrote:
> Jarcec,
> Thank you for your email and for giving me the idea. You are right: in that
> case the issue was that the value for device_os was too long.
> Now it works.
> So in brief, changing the type of 'ts' from bigint to numeric solved the
> issue.
> Thanks again,
> Matthieu
>
> On Thu, Nov 8, 2012 at 12:11 PM, Jarek Jarcec Cecho <[EMAIL PROTECTED]> wrote:
>
> > Hi Matthieu,
> > sorry for my late response.
> >
> > Thank you for researching what might help fix the issue. Sqoop
> > uses prepared statements, and therefore we're not explicitly escaping
> > data values. I believe that the '1351716176767' is just the way the PostgreSQL
> > connector displays the error, not the actual problem. You can see many
> > more number-based columns escaped with quotes even in your example.
> >
> > Would you mind sharing the task log for this particular job? I would like to
> > see the getNextException() output that should be there.
> >
> > Jarcec
> >
> > On Wed, Nov 07, 2012 at 01:10:10PM -0500, Matthieu Labour wrote:
> > > Hi Jarcec
> > >
> > > I did the following
> > >
> > > I changed the type of the 'ts' column to decimal: ALTER TABLE
> > > ml_ys_log_gmt_test ALTER ts TYPE decimal;
> > >
> > >         Table "public.ml_ys_log_gmt_test"
> > >  Column |  Type   | Modifiers | Storage | Description
> > > --------+---------+-----------+---------+-------------
> > >  ts     | numeric |           | main    |
> > >
> > >
> > > Then I ran the following command
> > >
> > > ~/$SQOOP_ROOT/bin/sqoop export --connect jdbc:postgresql://
> > > ec2-XX-XX-XXX-XX.compute-1.amazonaws.com:5662/dfg2tulf7263ut --username
> > > ufjb0gfs1n5kut --password xxxxxx --table ml_ys_log_gmt_test --export-dir
> > > $HADOOP_INPUT/$LOGS_HOME/dt=$(date +%Y-%m-%d)
> > > --input-fields-terminated-by='\t' --lines-terminated-by='\n' --verbose
> > > --batch
> > >
> > >
> > > 12/11/07 17:25:32 INFO mapred.JobClient: Task Id :
> > > attempt_201211071722_0002_m_000000_0, Status : FAILED
> > > java.io.IOException: java.sql.BatchUpdateException: Batch entry 71 INSERT
> > > INTO ml_ys_log_gmt_test (date, ts, environment, resource, network,
> > > advertiser, campaign, creative, cost, click, flavor, ui_element_id,
> > > ui_element_type, event, charge_id, charge_type, charge_vertical,
> > > charge_payer, charge_amount, model, imageformatsupport, inputmodesupport,
> > > scriptsupport, vendor, stylesheetsupport, markupsupport, displaywidth,
> > > cookiesupport, displayheight, version, inputdevices, displaycolordepth,
> > > device_os, device_os_version, mobile_browser, mobile_browser_version,
> > > is_tablet, dual_orientation, marketing_name) VALUES ('2012-10-31
> > > 20:42:56.767000 +00:00:00', '1351716176767',
> > >
> > > 'remote-addr=10.84.101.45&user-agent=LG-CT810%2FV10x+NetFront%2Fv3.5+Profile%2FMIDP-2.0+Configuration%2FCLDC-1.1',
> > > 'hxCA', 'MLNL', '1006', '10014', '1410', NULL, 'mdkP', '10014', '', '',
> > > 'click', '', '', '', '', NULL, 'CT810', '', '', '', 'LG', '', '', '400',
> > > '', '240', '', 'stylus', '', 'Windows Mobile OS', '6.1', 'Microsoft Mobile
> > > Explorer', '7.11', 'false', 'false', '') was aborted.  Call
> > > getNextException to see the cause.
> > >         at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:220)
> > >         at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
> > >         at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:646)
> > >         at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
> > >         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:78)
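An offline sanity check on the 'ts' value discussed in the thread (a minimal sketch in plain Java; no database or Sqoop needed): the literal 1351716176767 is an epoch-millisecond timestamp that fits comfortably in a signed 64-bit bigint, and it decodes to the same instant shown in the row's date column, which supports the point that the quoted number in the error display was not itself the problem:

```java
import java.time.Instant;

public class TsCheck {
    public static void main(String[] args) {
        // The ts value from the failing batch entry above
        long ts = 1351716176767L;
        // A bigint (signed 64-bit) tops out around 9.2e18, so the literal
        // itself is nowhere near out of range for the original column type.
        System.out.println(ts < Long.MAX_VALUE);
        // Read as epoch milliseconds, it matches the row's date column
        // ('2012-10-31 20:42:56.767000 +00:00:00').
        System.out.println(Instant.ofEpochMilli(ts)); // 2012-10-31T20:42:56.767Z
    }
}
```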
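For readers hitting the same "Call getNextException to see the cause" message: JDBC chains the real server-side error behind the generic BatchUpdateException, and it can be walked with getNextException(). A minimal self-contained sketch (the exception messages below are hypothetical stand-ins, not from this job's actual log):

```java
import java.sql.BatchUpdateException;
import java.sql.SQLException;

public class ChainedSqlErrors {
    public static void main(String[] args) {
        // Simulate the chain a JDBC driver builds when a batch fails.
        BatchUpdateException top = new BatchUpdateException(
                "Batch entry 71 INSERT INTO ml_ys_log_gmt_test ... was aborted.",
                new int[0]);
        top.setNextException(new SQLException(
                "ERROR: value too long for type character varying", "22001"));

        // Walk the chain with getNextException(); the server-side root cause
        // sits behind the generic "batch aborted" wrapper.
        for (SQLException e = top; e != null; e = e.getNextException()) {
            System.out.println(e.getSQLState() + ": " + e.getMessage());
        }
    }
}
```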