Re: Sqoop export to Netezza fails
Can you please send the complete task log and sqoop client log?
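
In case it helps, one rough way to capture both is sketched below. The
<tasktracker-host> name is a placeholder, the attempt id is taken from the
output you pasted, and 50060 is the default TaskTracker HTTP port on a stock
Hadoop 1.x install:

# Capture the full Sqoop client log (re-running your original command with
# its --verbose flag and redirecting both stdout and stderr to a file):
/usr/lib/sqoop/bin/sqoop export --table WTFE_KRANT_FOTO_SQOOP_EXPORT \
  --connect jdbc:netezza://<server>:5480/<database> --username <username> \
  --password <password> --export-dir /apps/hive/warehouse/wtfe_krant_foto \
  --verbose --direct --input-fields-terminated-by ";" 2>&1 | tee sqoop-client.log

# Fetch the log of one failed task attempt from the TaskTracker that ran it:
curl "http://<tasktracker-host>:50060/tasklog?attemptid=attempt_201305221522_0327_m_000001_2&all=true" \
  > task-attempt.log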

Venkat
On Thu, May 30, 2013 at 12:02 AM, Frans Drijver <[EMAIL PROTECTED]> wrote:

> Hi there,
>
>
> I am trying to export a table to Netezza via Sqoop with the following
> command:
>
> /usr/lib/sqoop/bin/sqoop export --table WTFE_KRANT_FOTO_SQOOP_EXPORT
> --connect jdbc:netezza://<server>:5480/<database> --username
> <username> --password <password> --export-dir
> /apps/hive/warehouse/wtfe_krant_foto --verbose --direct
> --input-fields-terminated-by ";"
>
> However, I get the following (rather nondescript) errors, partially
> copied below:
>
> 13/05/30 08:00:46 INFO mapred.JobClient: Task Id :
> attempt_201305221522_0327_m_000001_2, Status : FAILED
> java.lang.InterruptedException
> at java.lang.Object.wait(Native Method)
> at java.lang.Thread.join(Thread.java:1186)
> at java.lang.Thread.join(Thread.java:1239)
> at
> org.apache.sqoop.mapreduce.db.netezza.NetezzaExternalTableExportMapper.run(NetezzaExternalTableExportMapper.java:202)
> at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:396)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1178)
> at org.apache.hadoop.mapred.Child.main(Child.java:249)
>
> 13/05/30 08:00:46 INFO mapred.JobClient: Task Id :
> attempt_201305221522_0327_m_000003_2, Status : FAILED
> java.lang.RuntimeException: Error while running command to get file
> permissions : java.io.IOException: java.lang.InterruptedException
> at org.apache.hadoop.util.Shell.runCommand(Shell.java:258)
> at org.apache.hadoop.util.Shell.run(Shell.java:182)
> at
> org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
> at org.apache.hadoop.util.Shell.execCommand(Shell.java:461)
> at org.apache.hadoop.util.Shell.execCommand(Shell.java:444)
> at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:710)
> at
> org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:443)
> at
> org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.getOwner(RawLocalFileSystem.java:426)
> at org.apache.hadoop.mapred.TaskLog.obtainLogDirOwner(TaskLog.java:267)
> at
> org.apache.hadoop.mapred.TaskLogsTruncater.truncateLogs(TaskLogsTruncater.java:124)
> at org.apache.hadoop.mapred.Child$4.run(Child.java:260)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:396)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1178)
> at org.apache.hadoop.mapred.Child.main(Child.java:249)
>
> at
> org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:468)
> at
> org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.getOwner(RawLocalFileSystem.java:426)
> at org.apache.hadoop.mapred.TaskLog.obtainLogDirOwner(TaskLog.java:267)
> at
> org.apache.hadoop.mapred.TaskLogsTruncater.truncateLogs(TaskLogsTruncater.java:124)
> at org.apache.hadoop.mapred.Child$4.run(Child.java:260)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:396)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1178)
> at org.apache.hadoop.mapred.Child.main(Child.java:249)
>
> This is with Sqoop 1.4.2.23.
>
> The table in question has 344367175 rows and 23 columns. Its owner is the
> same user that I am running the Sqoop command as, and all permissions seem
> fine too.
>
> I have verified that the table exists in the Netezza database and the
> table in Hive is a straightforward text-based table.
>
> Anyone have any idea what I'm doing wrong?
>
> Thanks for any assistance!
>
>
> Kind regards,
>
>
> Frans Drijver
>

--
Regards

Venkat