Sqoop >> mail # user >> sqoop export - parse exception


Re: sqoop export - parse exception
Hi Peleg,
thank you very much for your feedback! Would you mind sharing with us the entire Sqoop log generated with the --verbose parameter, but without the --input-escaped-by option? I would be interested to see what exception was thrown. If the job is failing on the MapReduce side, please also share the failed map task's log, as it might contain additional information.

Jarcec
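
To illustrate why reusing the field delimiter as the escape character is problematic, here is a minimal sketch of a delimiter-aware parser. This is a hypothetical illustration, not Sqoop's actual parsing code; the `parse` function and its behavior are assumptions made for demonstration only:

```python
# Hypothetical sketch: when the escape character equals the field
# delimiter ('\t' for both), every tab is consumed as an escape and
# never acts as a field separator, so the record collapses into one
# field and a parse error is plausible downstream.
def parse(line, delim='\t', escape='\t'):
    fields, cur, i = [], '', 0
    while i < len(line):
        c = line[i]
        if c == escape and i + 1 < len(line):
            # escape consumes the next character literally
            cur += line[i + 1]
            i += 2
        elif c == delim:
            fields.append(cur)
            cur = ''
            i += 1
        else:
            cur += c
            i += 1
    fields.append(cur)
    return fields

print(parse('aa\ta'))                 # escape wins: ['aaa'] -- one field, no split
print(parse('aa\ta', escape='\\'))    # distinct escape: ['aa', 'a'] -- two fields
```

Under this model, a line like "aa<TAB>a" yields a single field instead of the two columns the target table expects, which is consistent with a "Could not parse record" failure.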

On Sun, Jun 09, 2013 at 03:36:10PM +0000, Peleg, Eyal wrote:
> I tried without the --input-escaped-by statement, and it threw an exception.
> Only when I did use this statement, even though both parameters had the same value, did the first table I mentioned go through successfully.
>
> -----Original Message-----
> From: Jarek Jarcec Cecho [mailto:[EMAIL PROTECTED]]
> Sent: Sunday, June 09, 2013 17:58
> To: [EMAIL PROTECTED]
> Subject: Re: sqoop export - parse exception
>
> Hi Peleg,
> I'm not sure that using the same value for both --input-fields-terminated-by and --input-escaped-by is correct. Is your input really delimited and at the same time escaped by tabulators?
>
> Jarcec
>
> On Sun, Jun 09, 2013 at 11:39:51AM +0000, Peleg, Eyal wrote:
> > My export command is as follows:
> >
> >
> > sqoop export --connect 'jdbc:sqlserver://papdb-dev.intel.com:3180;username=epmsysadm;password=s!peruser; DATABASE=AdvancedBIsystem' --table testing --export-dir  /user/eyapeleg/test --input-fields-terminated-by '\t'  --input-escaped-by '\t'  --lines-terminated-by '\n' --username epmsysadm --password 's!peruser'
> >
> > *note! I used tab delimited format.
> >
> > I'm able to export the following table:
> >
> > a              a
> > b             b
> >
> > but  fail to export the next one:
> >
> > aa           a
> > bb           a
> >
> > I get a parse exception:
> >
> > java.io.IOException: Could not parse record: aa a
> >         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:80)
> >         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
> >         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> >         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:183)
> >         at org.apache.hadoop.mapred.MapTask.runNewMapper_aroundBody4(MapTask.java:813)
> >         at org.apache.hadoop.mapred.MapTask$AjcClosure5.run(MapTask.java:1)
> >         at org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:149)
> > ...
> >
> > Best Regards,
> >
> >
> >
> >
> > ---------------------------------------------------------------------
> > Intel Electronics Ltd.
> >
> > This e-mail and any attachments may contain confidential material for
> > the sole use of the intended recipient(s). Any review or distribution
> > by others is strictly prohibited. If you are not the intended
> > recipient, please contact the sender and delete all copies.
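
Following Jarcec's observation, one way to avoid the ambiguity is to pick an escape character distinct from the field delimiter (backslash is the conventional choice), or to drop --input-escaped-by entirely if the data contains no embedded delimiters. A hedged sketch of the adjusted command (untested; host, credentials, table, and paths are placeholders standing in for the values from the thread):

```shell
# Sketch only: backslash escaping instead of tab-as-escape.
# Replace the placeholder connection string, table, and paths
# with the real values before running.
sqoop export \
  --connect 'jdbc:sqlserver://<host>:<port>;DATABASE=<db>' \
  --username '<user>' --password '<password>' \
  --table testing \
  --export-dir /user/<user>/test \
  --input-fields-terminated-by '\t' \
  --input-escaped-by '\\' \
  --lines-terminated-by '\n'
```

Whether this resolves the parse exception depends on how the export files were originally written; if they were generated by a Sqoop import, the same delimiter/escape settings should be used on both sides.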