Sqoop >> mail # user >> export job failed with oracle


Hi All,

   I ran an export job to export data to my Oracle 10g database:

/home/sqoop-1.4.1-cdh4.1.2/bin/sqoop export \
  --connect jdbc:oracle:thin:@10.167.14.225:1521:wxoss \
  --username XUJINGYU --password 123456 \
  --export-dir sqoop/NMS_CMTS_CPU_CDX \
  --table NMS_CMTS_CPU_CDX_TEST \
  --input-fields-terminated-by "|"

  However, I get the exception [1].
  It is weird because my import job from Oracle succeeded.

  Any suggestion will be appreciated.

Thank you.
[1]==========================================================...
13/04/19 10:12:17 INFO mapreduce.Job: The url to track the job: http://Hadoop01:8088/proxy/application_1364348895095_0040/
13/04/19 10:12:17 INFO mapreduce.Job: Running job: job_1364348895095_0040
13/04/19 10:12:30 INFO mapreduce.Job: Job job_1364348895095_0040 running in uber mode : false
13/04/19 10:12:30 INFO mapreduce.Job:  map 0% reduce 0%
13/04/19 10:12:40 INFO mapreduce.Job: Task Id : attempt_1364348895095_0040_m_000002_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
        at org.apache.sqoop.mapreduce.CombineFileRecordReader.initNextRecordReader(CombineFileRecordReader.java:166)
        at org.apache.sqoop.mapreduce.CombineFileRecordReader.<init>(CombineFileRecordReader.java:125)
        at org.apache.sqoop.mapreduce.ExportInputFormat.createRecordReader(ExportInputFormat.java:94)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.<init>(MapTask.java:455)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:697)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.apache.sqoop.mapreduce.CombineFileRecordReader.initNextRecordReader(CombineFileRecordReader.java:148)
        ... 10 more
Caused by: java.net.ConnectException: Call From Hadoop04/10.167.14.224 to Hadoop01:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:721)
        at org.apache.hadoop.ipc.Client.call(Client.java:1164)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
        at $Proxy10.getFileInfo(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
        at $Proxy10.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:628)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1507)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:783)
        at org.apache.sqoop.mapreduce.ExportJobBase.getFileType(ExportJobBase.java:110)
        at org.apache.sqoop.mapreduce.ExportJobBase.isSequenceFiles(ExportJobBase.java:98)
        at org.apache.sqoop.mapreduce.CombineShimRecordReader.createChildReader(CombineShimRecordReader.java:120)
        at org.apache.sqoop.mapreduce.CombineShimRecordReader.<init>(CombineShimRecordReader.java:60)
        ... 15 more
Caused by: java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:207)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:523)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:488)
        at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:476)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:570)
        at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:220)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1213)
        at org.apache.hadoop.ipc.Client.call(Client.java:1140)
        ... 31 more
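Note that the innermost cause above is an HDFS-side failure, not an Oracle one: the map task on Hadoop04 is refused a connection to the NameNode RPC endpoint Hadoop01:8020 while Sqoop inspects the export directory. A minimal connectivity check, run from the failing node, might look like the sketch below (hostname and port are taken from the error message; `check_port` is a hypothetical helper, not part of Hadoop or Sqoop):

```shell
# Probe a TCP endpoint using bash's /dev/tcp pseudo-device, with a 2s timeout.
check_port() {
  local host=$1 port=$2
  if timeout 2 bash -c "cat < /dev/null > /dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "${host}:${port} reachable"
  else
    echo "${host}:${port} refused/unreachable"
  fi
}

# Values taken from the stack trace: the NameNode address the tasks are using.
check_port Hadoop01 8020
```

If the probe reports refused, the usual suspects are the NameNode process not running on Hadoop01, a firewall, or `fs.defaultFS` in core-site.xml pointing at an address/port the NameNode is not actually listening on, as described on the ConnectionRefused wiki page linked in the trace.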