Sqoop >> mail # user >> Sqoop export failed: Incorrect syntax near ','


Re: Sqoop export failed: Incorrect syntax near ','
I've just realized what is wrong: the --hive-drop-import-delims parameter is import-specific, and it seems you're trying to use it for export. Would you mind re-importing your data with this parameter and then trying the export again?

Jarcec
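For reference, a re-import along the lines Jarcec suggests might look like the sketch below. The connection string, credentials, table name, and directories are placeholders, not values from this thread:

```shell
# Hypothetical re-import: --hive-drop-import-delims strips \n, \r, and \01
# from string fields so they cannot break the line-oriented HDFS text format.
sqoop import \
  --connect "jdbc:sqlserver://dbhost:1433;databaseName=mydb" \
  --username myuser --password-file /user/me/db.password \
  --table MyTable \
  --target-dir /data/mytable \
  --hive-drop-import-delims

# Then export the cleaned data back. Note there is no --hive-drop-import-delims
# here: it is not a valid export argument, which is exactly the
# "Unrecognized argument" error shown below.
sqoop export \
  --connect "jdbc:sqlserver://dbhost:1433;databaseName=mydb" \
  --username myuser --password-file /user/me/db.password \
  --table MyTable \
  --export-dir /data/mytable
```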

On Fri, Dec 07, 2012 at 03:45:55PM +0800, Chun-fan Ivan Liao wrote:
> I've upgraded Sqoop to 1.4.2 and copied hadoop-core-1.0.3.jar
> and sqljdbc4.jar to /usr/local/sqoop/lib. I've also specified the parameter
> "--hive-drop-import-delims" in the command, but the same error remained.
> Parameters specified after --hive-drop-import-delims could not be parsed:
>
> =============>
> 12/12/06 23:39:07 ERROR tool.BaseSqoopTool: Error parsing arguments for
> export:
> 12/12/06 23:39:07 ERROR tool.BaseSqoopTool: Unrecognized argument:
> --hive-drop-import-delims
> 12/12/06 23:39:07 ERROR tool.BaseSqoopTool: Unrecognized argument: --verbose
>
> Try --help for usage instructions.
> ....
> =============>
> Is there anything I can do now, e.g. re-importing the data using the default
> connector to see if the imported data can be exported back to SQL Server?
>
>
> On Fri, Dec 7, 2012 at 12:01 PM, Jarek Jarcec Cecho <[EMAIL PROTECTED]> wrote:
>
> > I see. Would you mind upgrading your Sqoop to the most recent version, 1.4.2?
> >
> > Jarcec
> >
> > On Fri, Dec 07, 2012 at 11:19:31AM +0800, Chun-fan Ivan Liao wrote:
> > > Hi Jarek,
> > >
> > > I've tried to use "--hive-drop-import-delims", but Sqoop showed it has a
> > > syntax error:
> > >
> > >   ERROR tool.BaseSqoopTool: Unrecognized argument:
> > >   --hive-drop-import-delims
> > >
> > > Also, should I switch from OpenJDK to Oracle JDK in order to make Sqoop
> > > export work?
> > >
> > > Thanks!
> > > Ivan
> > >
> > > On Fri, Dec 7, 2012 at 12:28 AM, Jarek Jarcec Cecho <[EMAIL PROTECTED]> wrote:
> > >
> > > > Hi Ivangelion,
> > > > I'm glad that you were able to move on with your issue. It seems to me
> > > > that you're running on OpenJDK - unfortunately, Sqoop is tested and
> > > > supported only on Oracle JDK.
> > > >
> > > > Based on the exceptions you're hitting:
> > > >
> > > >   java.lang.NumberFormatException: For input string: "Male"
> > > >
> > > >   java.lang.IllegalArgumentException: Timestamp format must be
> > > >   yyyy-mm-dd hh:mm:ss[.fffffffff]
> > > >
> > > > It seems to me your input files somehow got corrupted; for example,
> > > > for the first exception, Sqoop is looking for a column that should be
> > > > a number but found the string "Male" instead. You've mentioned that
> > > > your data can contain a lot of wild characters - can it happen that
> > > > your data also contains newline characters? Would you mind re-trying
> > > > the import with the parameter --hive-drop-import-delims [1] to see if
> > > > it helps? (This parameter does not depend on Hive in any way,
> > > > regardless of its name.)
> > > >
> > > > Jarcec
> > > >
> > > > On Thu, Dec 06, 2012 at 12:03:06PM +0800, Ivangelion wrote:
> > > > > Hi Jarek,
> > > > >
> > > > > It actually worked! Thank you so much~! :D
> > > > >
> > > > > However, we now face another problem. The data we previously tried to
> > > > > export was only test data, with a row count of just 10. When we tried
> > > > > to export production data back into SQL Server from an HDFS file that
> > > > > had previously been imported from SQL Server using Sqoop, different
> > > > > errors occurred. The row count is about 400k, and only about 120k rows
> > > > > were exported. This time we used "-m 5"; when using "-m 1", nothing
> > > > > is exported at all. The verbose log is at the bottom of this mail.
> > > > >
> > > > > Does this have to do with the fact that we used the MS SQL connector
> > > > > for the previous import, not the default one?
> > > > >
> > > > > Also, should we specify a character encoding, e.g. UTF-8, during the
> > > > > import/export process? There are characters from many different
> > > > > languages in our original data in SQL Server, and I'm not sure what
> > > > > the encoding is after importing into HDFS.
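The failure mode Jarcec describes earlier in the thread can be sketched in a few lines: a free-text field containing a newline splits one record across two lines of a line-oriented file, shifting later fields into the wrong columns, so a numeric parse can hit a string like "Male". This is an illustrative sketch, not Sqoop code:

```python
# Records: (id, comment, gender, age). The comment in record 2 contains a
# newline, which corrupts a line-per-record, comma-delimited layout.
records = [
    (1, "fine comment", "Male", 30),
    (2, "bad\ncomment", "Female", 25),
]

def write_lines(recs):
    # Naive serialization: one record per line, fields joined by commas.
    return "\n".join(",".join(str(f) for f in r) for r in recs)

def parse_age(line):
    # Expect the 4th field to be numeric, as Sqoop expects for a number column.
    return int(line.split(",")[3])

lines = write_lines(records).split("\n")
# The embedded newline split record 2 in two: we get 3 lines, not 2.
print(len(lines))  # 3

# Removing delimiter characters from string fields before writing (which is
# what --hive-drop-import-delims does on import) keeps each record on one line.
cleaned = [(i, c.replace("\n", ""), g, a) for (i, c, g, a) in records]
lines2 = write_lines(cleaned).split("\n")
print(len(lines2))  # 2
print(parse_age(lines2[1]))  # 25
```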