Sqoop >> mail # user >> Import data to HDFS using Sqoop2


Re: Import data to HDFS using Sqoop2
Yanting,

Also, it seems like the schema you've provided is for an Oracle database:
VARCHAR2 and NUMBER are datatypes specific to Oracle. Could you please use an
Oracle JDBC driver and connection string, i.e.
oracle.jdbc.driver.OracleDriver
and jdbc:oracle:thin:@host:port:SID?
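Outside of Sqoop, a minimal JDBC sketch of that driver/URL combination looks like
this (the host, port, SID, and credentials are placeholders, not values from this
thread):

```java
// Sketch: building the Oracle thin-driver URL and opening a connection.
// "dbhost", 1521, "ORCL", and the credentials below are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;

public class OracleConnect {
    static String thinUrl(String host, int port, String sid) {
        // jdbc:oracle:thin:@host:port:SID
        return "jdbc:oracle:thin:@" + host + ":" + port + ":" + sid;
    }

    public static void main(String[] args) throws Exception {
        String url = thinUrl("dbhost", 1521, "ORCL");
        System.out.println(url); // jdbc:oracle:thin:@dbhost:1521:ORCL
        // Opening a real connection requires the Oracle ojdbc jar on the classpath:
        // Class.forName("oracle.jdbc.driver.OracleDriver");
        // Connection conn = DriverManager.getConnection(url, "user", "password");
    }
}
```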

-abe
On Wed, Sep 4, 2013 at 7:32 PM, Mengwei Ding <[EMAIL PROTECTED]> wrote:

> Hmm... would you mind showing us your most updated job configuration by
> typing "show job --jid 3"? I just want to make sure that you provide the
> partition column correctly.
>
> Also, I notice that the primary key for this table is of type "VARCHAR(23)";
> that might be the problem.
>
> Mengwei
>
>
> On Wed, Sep 4, 2013 at 10:23 PM, Yanting Chen <[EMAIL PROTECTED]> wrote:
>
>> Hi Mengwei
>>
>> I tried setting the primary key as the partition column, but I still get the same error!
>>
>>
>> On Thu, Sep 5, 2013 at 10:17 AM, Mengwei Ding <[EMAIL PROTECTED]> wrote:
>>
>>> Hi Yanting,
>>>
>>> It seems like you did not specify the 'partition column' for the job.
>>> Generally, the primary key of the table is a good choice for the 'partition
>>> column'.
>>>
>>> You could use 'update job --jid 3' to update the job configuration.
>>>
>>> Mengwei
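The update flow in the Sqoop2 shell is interactive; as a rough sketch (the exact
prompt text and field labels vary between 1.99.x releases, so treat this as
approximate):

```
sqoop:000> update job --jid 3
...                           # interactive prompts follow; set the partition
                              # column field to the table's primary key
sqoop:000> show job --jid 3   # confirm the new value was saved
```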
>>>
>>>
>>> On Wed, Sep 4, 2013 at 9:37 PM, Yanting Chen <[EMAIL PROTECTED]> wrote:
>>>
>>>> Abraham,
>>>>
>>>> Thanks for your answer. I reviewed my database.
>>>> I think the database name is invoice and the schema name is public,
>>>> just like the picture below:
>>>> http://imgur.com/ns0iNLi
>>>>
>>>> So, I changed the schema name to a new value, "public".
>>>> Then, I ran the job and got a different error:
>>>>
>>>> Status: FAILURE_ON_SUBMIT
>>>> Creation date: 2013-09-05 09:30:44 CST
>>>> Last update date: 2013-09-05 09:30:44 CST
>>>> Exception: org.apache.sqoop.common.SqoopException:
>>>> GENERIC_JDBC_CONNECTOR_0011:The type is not supported - 12
>>>> Stack trace: org.apache.sqoop.common.SqoopException:
>>>> GENERIC_JDBC_CONNECTOR_0011:The type is not supported - 12
>>>> at org.apache.sqoop.connector.jdbc.GenericJdbcImportPartitioner.getPartitions(GenericJdbcImportPartitioner.java:87)
>>>> at org.apache.sqoop.connector.jdbc.GenericJdbcImportPartitioner.getPartitions(GenericJdbcImportPartitioner.java:32)
>>>> at org.apache.sqoop.job.mr.SqoopInputFormat.getSplits(SqoopInputFormat.java:71)
>>>> at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:452)
>>>> at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:469)
>>>> at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:366)
>>>> at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1269)
>>>> at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1266)
>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
>>>> at org.apache.hadoop.mapreduce.Job.submit(Job.java:1266)
>>>> at org.apache.sqoop.submission.mapreduce.MapreduceSubmissionEngine.submit(MapreduceSubmissionEngine.java:265)
>>>> at org.apache.sqoop.framework.FrameworkManager.submit(FrameworkManager.java:480)
>>>> at org.apache.sqoop.handler.SubmissionRequestHandler.submissionSubmit(SubmissionRequestHandler.java:112)
>>>> at org.apache.sqoop.handler.SubmissionRequestHandler.handleActionEvent(SubmissionRequestHandler.java:98)
>>>> at org.apache.sqoop.handler.SubmissionRequestHandler.handleEvent(SubmissionRequestHandler.java:68)
>>>> at org.apache.sqoop.server.v1.SubmissionServlet.handlePostRequest(SubmissionServlet.java:44)
>>>> at org.apache.sqoop.server.SqoopProtocolServlet.doPost(SqoopProtocolServlet.java:63)
>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
>>>> at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
>>>> at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
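For anyone hitting the same GENERIC_JDBC_CONNECTOR_0011 error: the trailing
number appears to be a JDBC type code from java.sql.Types, and 12 is VARCHAR,
which matches the VARCHAR(23) primary key being used as the partition column.
A quick way to decode such codes (nothing here is Sqoop-specific):

```java
// java.sql.Types defines the integer type codes that show up in the
// "The type is not supported - 12" message; code 12 is VARCHAR.
import java.sql.Types;

public class DecodeJdbcType {
    public static void main(String[] args) {
        System.out.println(Types.VARCHAR); // prints 12
        System.out.println(Types.NUMERIC); // prints 2
    }
}
```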