Sqoop, mail # user - Re: Timestamp not supported in sqoop2


Re: Timestamp not supported in sqoop2
Jarek Jarcec Cecho 2013-11-02, 01:19
Hi Yash,
would you mind sharing with us the job details? (output of command show job --jid X)

Jarcec

On Fri, Nov 01, 2013 at 09:41:03AM -0700, Yash Ranadive wrote:
> Abraham,
>
> I'm using 1.99.2. Thanks, Vasanth; it looks like it is not supported.
>
> Yash
>
>
> On Wed, Oct 30, 2013 at 11:03 AM, vasanth kumar
> <[EMAIL PROTECTED]> wrote:
>
> > Hi,
> > Date and timestamp data types are not supported while importing. Try the
> > query below in the job:
> >
> > select CAST(ts as CHAR(50)) ts from test where ${CONDITIONS}
> >
> > Try the latest unreleased version; in 1.99.2 the query above does not fit
> > in the job's "Table SQL statement" field.
> >
> > Thanks,
> > Vasanth kumar
> >
> >
> > On Wed, Oct 30, 2013 at 4:34 AM, Abraham Elmahrek <[EMAIL PROTECTED]> wrote:
> >
> >> Yash,
> >>
> >> What version/distribution of Sqoop 2 are you using? This exception is
> >> very weird. It's occurring in a section of code that shouldn't be hit.
> >>
> >> -Abe
> >>
> >>
> >> On Tue, Oct 22, 2013 at 12:59 PM, Yash Ranadive <
> >> [EMAIL PROTECTED]> wrote:
> >>
> >>> Here's the schema of the mysql table
> >>>
> >>> CREATE TABLE `test` (
> >>>   `id` int(11) DEFAULT NULL,
> >>>   `value` varchar(255) DEFAULT NULL,
> >>>   `zid` int(11) NOT NULL AUTO_INCREMENT,
> >>>   `ts` timestamp NOT NULL DEFAULT '0000-00-00 00:00:00' ON UPDATE CURRENT_TIMESTAMP,
> >>>   PRIMARY KEY (`zid`)
> >>> ) ENGINE=InnoDB AUTO_INCREMENT=19 DEFAULT CHARSET=latin1
> >>>
> >>>
> >>> On Tue, Oct 22, 2013 at 12:17 PM, Yash Ranadive <
> >>> [EMAIL PROTECTED]> wrote:
> >>>
> >>>> I see the following error in the MapReduce logs when trying to execute
> >>>> a sqoop2 job that gets data from a mysql table with a timestamp column. A
> >>>> table with no timestamps gets loaded fine.
> >>>>
> >>>> Does sqoop2 not support timestamp columns? Sounds counterintuitive.
> >>>>
> >>>>
> >>>>
> >>>>
> >>>> 2013-10-22 12:03:02,570 FATAL [IPC Server handler 2 on 42160] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1381794869316_1175_m_000000_0 - exited : org.apache.sqoop.common.SqoopException: MAPRED_EXEC_0017:Error occurs during extractor run
> >>>> at org.apache.sqoop.job.mr.SqoopMapper.run(SqoopMapper.java:98)
> >>>> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:756)
> >>>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:338)
> >>>> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:157)
> >>>> at java.security.AccessController.doPrivileged(Native Method)
> >>>> at javax.security.auth.Subject.doAs(Subject.java:415)
> >>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
> >>>> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:152)
> >>>> Caused by: org.apache.sqoop.common.SqoopException: MAPRED_EXEC_0013:Cannot write to the data writer
> >>>> at org.apache.sqoop.job.mr.SqoopMapper$MapDataWriter.writeContent(SqoopMapper.java:146)
> >>>> at org.apache.sqoop.job.mr.SqoopMapper$MapDataWriter.writeArrayRecord(SqoopMapper.java:128)
> >>>> at org.apache.sqoop.connector.jdbc.GenericJdbcImportExtractor.extract(GenericJdbcImportExtractor.java:61)
> >>>> at org.apache.sqoop.connector.jdbc.GenericJdbcImportExtractor.extract(GenericJdbcImportExtractor.java:31)
> >>>> at org.apache.sqoop.job.mr.SqoopMapper.run(SqoopMapper.java:93)
> >>>> ... 7 more
> >>>> Caused by: java.io.IOException: org.apache.sqoop.common.SqoopException: MAPRED_EXEC_0012:The type is not supported - java.sql.Timestamp
> >>>> at org.apache.sqoop.job.io.Data.writeArray(Data.java:309)
> >>>> at org.apache.sqoop.job.io.Data.write(Data.java:171)
> >>>> at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:100)
> >>>> at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:84)
> >>>> at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1112)
> >>>> at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:685)
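
The stack trace points at Data.writeArray raising MAPRED_EXEC_0012 when a field's Java type is outside the set it knows how to serialize. The sketch below (Python, with hypothetical names; this is not Sqoop's actual implementation) illustrates that failure mode, and why the CAST workaround suggested earlier in the thread sidesteps it:

```python
# Illustrative sketch only, NOT Sqoop's code: mimics a serializer that
# accepts a fixed whitelist of field types and rejects everything else,
# the behavior implied by "MAPRED_EXEC_0012: The type is not supported".
import datetime

SUPPORTED_TYPES = (int, float, str, bytes)  # hypothetical whitelist

def write_array(record):
    """Serialize one row, rejecting fields of unsupported types."""
    out = []
    for value in record:
        if not isinstance(value, SUPPORTED_TYPES):
            raise TypeError(
                "MAPRED_EXEC_0012: The type is not supported - %s"
                % type(value).__name__
            )
        out.append(str(value))
    return ",".join(out)

# A row whose `ts` column arrives as a timestamp object fails...
row_with_timestamp = (1, "hello", datetime.datetime(2013, 10, 22, 12, 3))
try:
    write_array(row_with_timestamp)
except TypeError as e:
    print(e)  # MAPRED_EXEC_0012: The type is not supported - datetime

# ...but casting the timestamp to a string on the database side, as in
# SELECT CAST(ts AS CHAR(50)) ts FROM test WHERE ${CONDITIONS},
# keeps the extracted value inside the supported set:
row_with_cast = (1, "hello", "2013-10-22 12:03:00")
print(write_array(row_with_cast))  # 1,hello,2013-10-22 12:03:00
```

The CAST pushes the type conversion into MySQL, so the JDBC extractor sees a plain string column and never hands a java.sql.Timestamp to the writer.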