Sqoop >> mail # user >> export job failed with oracle


YouPeng Yang 2013-04-19, 04:07
Re: export job failed with oracle
Hi All

   I think I have found the reason.
   There is a DATE column in my table NMS_CMTS_CPU_CDX_TEST.

   And I found the same error reported at this URL:
https://groups.google.com/a/cloudera.org/forum/?fromgroups=#!topic/sqoop-user/I0zqhKOdOyQ
  Following that thread, I installed the Quest Data Connector for Oracle and
Hadoop, and the export goes well when I leave the DATE column out of my table
and out of the corresponding dataset in my HDFS file.
  However, when I run the original export job again (with the DATE column
included), the exception comes back.
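
  For anyone who wants to try the same workaround, here is a rough sketch of
what a DATE-free export could look like. It is only an illustration: COL_A and
COL_B are placeholder names for the real non-DATE columns of
NMS_CMTS_CPU_CDX_TEST, and the files under sqoop/NMS_CMTS_CPU_CDX would have to
contain exactly those fields in that order.

# sketch: export only the non-DATE columns; COL_A and COL_B are placeholders
/home/sqoop-1.4.1-cdh4.1.2/bin/sqoop export \
  --connect jdbc:oracle:thin:@10.167.14.225:1521:wxoss \
  --username XUJINGYU --password 123456 \
  --table NMS_CMTS_CPU_CDX_TEST \
  --columns "COL_A,COL_B" \
  --export-dir sqoop/NMS_CMTS_CPU_CDX \
  --input-fields-terminated-by '|'

  Another option sometimes mentioned is to keep the DATE column and override
its generated Java type with the codegen argument --map-column-java (for
example --map-column-java SOME_DATE_COL=String, where SOME_DATE_COL is again a
placeholder), though that has not been verified against this table.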

  Has anyone run into the same problem, or does anyone have a suggestion?
Regards
2013/4/19 YouPeng Yang <[EMAIL PROTECTED]>

> Hi All
>
>    I ran an export job to export data to my Oracle 10g database:
> /home/sqoop-1.4.1-cdh4.1.2/bin/sqoop export --connect jdbc:oracle:thin:@10.167.14.225:1521:wxoss -username XUJINGYU -password 123456 --export-dir sqoop/NMS_CMTS_CPU_CDX --table NMS_CMTS_CPU_CDX_TEST --input-fields-terminated-by "|"
>
>   However, I get the exception [1].
>   It is weird because my import job from Oracle succeeded.
>
>   Any suggestion will be appreciated.
>
> Thank you.
>
>
> [1]==========================================================> ...
> 13/04/19 10:12:17 INFO mapreduce.Job: The url to track the job: http://Hadoop01:8088/proxy/application_1364348895095_0040/
> 13/04/19 10:12:17 INFO mapreduce.Job: Running job: job_1364348895095_0040
> 13/04/19 10:12:30 INFO mapreduce.Job: Job job_1364348895095_0040 running in uber mode : false
> 13/04/19 10:12:30 INFO mapreduce.Job:  map 0% reduce 0%
> 13/04/19 10:12:40 INFO mapreduce.Job: Task Id : attempt_1364348895095_0040_m_000002_0, Status : FAILED
> Error: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
>         at org.apache.sqoop.mapreduce.CombineFileRecordReader.initNextRecordReader(CombineFileRecordReader.java:166)
>         at org.apache.sqoop.mapreduce.CombineFileRecordReader.<init>(CombineFileRecordReader.java:125)
>         at org.apache.sqoop.mapreduce.ExportInputFormat.createRecordReader(ExportInputFormat.java:94)
>         at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.<init>(MapTask.java:455)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:697)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:152)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:147)
> Caused by: java.lang.reflect.InvocationTargetException
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at org.apache.sqoop.mapreduce.CombineFileRecordReader.initNextRecordReader(CombineFileRecordReader.java:148)
>         ... 10 more
> Caused by: java.net.ConnectException: Call From Hadoop04/10.167.14.224 to Hadoop01:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:721)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1164)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
>         at $Proxy10.getFileInfo(Unknown Source)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
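
  A note on the trace above: the innermost cause is the "Connection refused"
on the call from Hadoop04/10.167.14.224 to Hadoop01:8020, i.e. the map task on
Hadoop04 cannot reach the NameNode RPC port on Hadoop01. A quick sanity check,
run from Hadoop04, could look like the sketch below (the host and port come
straight from the trace, not from the cluster configuration):

# can this node reach the NameNode RPC port named in the trace?
hdfs dfs -ls hdfs://Hadoop01:8020/
# compare with whatever default filesystem the client is actually configured to use
hdfs dfs -ls /
# (or test the raw TCP connection: telnet Hadoop01 8020)

  If the first command also fails with "Connection refused", the problem is
reaching the NameNode (wrong address/port, NameNode down, or a firewall)
rather than the DATE column itself.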
YouPeng Yang 2013-04-19, 06:14