Sqoop, mail # user - Import data to HDFS using Sqoop2


Re: Import data to HDFS using Sqoop2
Abraham Elmahrek 2013-09-13, 18:09
Yanting,

I'm not sure that "text" is partitionable. That might be the problem.

-Abe
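[A hedged sketch of one possible workaround, assuming the job is partitioning on the text primary key "message_id": give Sqoop2 a numeric column it can range-split instead. The column name "row_seq" is invented here for illustration.]

```sql
-- Hypothetical workaround: Sqoop2's default splitter expects a numeric or
-- date/time partition column, and "message_id" is text. Adding a numeric
-- surrogate column (name "row_seq" is made up here) gives it something
-- splittable:
ALTER TABLE "public"."ds_msg_log" ADD COLUMN "row_seq" bigserial;

-- Alternatively, an existing timestamp column such as "acpt_dts" could be
-- used as the partition column without any schema change.
```

[The job's partition column would then be set to the surrogate column or the timestamp rather than "message_id".]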
On Tue, Sep 10, 2013 at 2:19 AM, Yanting Chen <[EMAIL PROTECTED]> wrote:

> Hi Abraham
>
> Please ignore my last post containing table schema in Oracle.
>
> The following is the schema I use to create table ds_msg_log in PostgreSQL
>
> CREATE TABLE "public"."ds_msg_log" (
> "message_id" text NOT NULL,
> "login_id" text,
> "acpt_dts" timestamp(6),
> "dlvr_dts" timestamp(6),
> "sender_id" text,
> "sender_vac_id" text,
> "receiver_id" text,
> "receiver_vac_id" text,
> "status" text,
> "message_type" text,
> "flow_type" text,
> "service_type" text,
> "source_file_name" text,
> "archive_file_name" text,
> "archive_char_count" numeric(32),
> "decrypt_file_name" text,
> "decrypt_char_count" numeric(32),
> "resp_file_name" text,
> "resp_char_count" numeric(32),
> "resp_flag" text,
> "rtg_seq" text,
> "resent_flag" text DEFAULT 'N'::character varying,
> "total_inv_count" numeric(32),
> CONSTRAINT "ds_msg_log_pkey" PRIMARY KEY ("message_id")
> )
> WITH (OIDS=FALSE)
> ;
>
>
> On Tue, Sep 10, 2013 at 12:22 AM, Abraham Elmahrek <[EMAIL PROTECTED]> wrote:
>
>> Yanting,
>>
>> This can't be the schema for PostgreSQL. There are datatypes here that
>> PostgreSQL doesn't support. Could you provide the schema from
>> PostgreSQL?
>>
>> -Abe
>>
>>
>> On Mon, Sep 9, 2013 at 12:18 AM, Yanting Chen <[EMAIL PROTECTED]> wrote:
>>
>>> Hi Mengwei
>>>
>>> After seeing my table schema, do you have any further ideas about this
>>> problem?
>>>
>>>
>>>
>>>
>>> On Fri, Sep 6, 2013 at 9:02 AM, Yanting Chen <[EMAIL PROTECTED]> wrote:
>>>
>>>> TABLE DS_MSG_LOG
>>>>     (
>>>>         MESSAGE_ID VARCHAR2(23) NOT NULL,
>>>>         LOGIN_ID VARCHAR2(30),
>>>>         ACPT_DTS TIMESTAMP(6),
>>>>         DLVR_DTS TIMESTAMP(6),
>>>>         SENDER_ID VARCHAR2(30),
>>>>         SENDER_VAC_ID VARCHAR2(39),
>>>>         RECEIVER_ID VARCHAR2(30),
>>>>         RECEIVER_VAC_ID VARCHAR2(39),
>>>>         STATUS VARCHAR2(1),
>>>>         MESSAGE_TYPE VARCHAR2(8),
>>>>         FLOW_TYPE VARCHAR2(5),
>>>>         SERVICE_TYPE VARCHAR2(1),
>>>>         SOURCE_FILE_NAME VARCHAR2(150),
>>>>         ARCHIVE_FILE_NAME VARCHAR2(250),
>>>>         ARCHIVE_CHAR_COUNT NUMBER,
>>>>         DECRYPT_FILE_NAME VARCHAR2(250),
>>>>         DECRYPT_CHAR_COUNT NUMBER,
>>>>         RESP_FILE_NAME VARCHAR2(250),
>>>>         RESP_CHAR_COUNT NUMBER,
>>>>         RESP_FLAG VARCHAR2(1),
>>>>         RTG_SEQ VARCHAR2(8),
>>>>         RESENT_FLAG VARCHAR2(1) DEFAULT 'N',
>>>>         TOTAL_INV_COUNT NUMBER,
>>>>         CONSTRAINT PK_DS_MSG_LOG PRIMARY KEY (MESSAGE_ID)
>>>>     )
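[For reference, the Oracle-to-PostgreSQL type mapping implied by the two schemas in this thread can be sketched as follows; it is one reasonable mapping, not the only one.]

```sql
-- Mapping used when porting the Oracle DDL above to PostgreSQL:
--   VARCHAR2(n)  -> text (or varchar(n) to preserve the length limit)
--   NUMBER       -> numeric(32)
--   TIMESTAMP(6) -> timestamp(6)
--
-- For example, the Oracle column
--   MESSAGE_ID VARCHAR2(23) NOT NULL
-- becomes the PostgreSQL column
--   "message_id" text NOT NULL
```

[Note that mapping VARCHAR2 to unbounded text, as the PostgreSQL schema above does, leaves the primary key as a text column, which ties into the partition-column discussion elsewhere in this thread.]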
>>>>
>>>>
>>>> On Fri, Sep 6, 2013 at 1:55 AM, Abraham Elmahrek <[EMAIL PROTECTED]> wrote:
>>>>
>>>>> Could you provide your schema from PostGreSQL? Mengwei is likely
>>>>> right.
>>>>>
>>>>>
>>>>> On Wed, Sep 4, 2013 at 7:36 PM, Yanting Chen <[EMAIL PROTECTED]> wrote:
>>>>>
>>>>>> Actually, the schema comes from Oracle. However, I tried to modify it
>>>>>> to fit the PostgreSQL database. So the database I am now using is
>>>>>> PostgreSQL.
>>>>>>
>>>>>>
>>>>>> On Thu, Sep 5, 2013 at 10:33 AM, Abraham Elmahrek <[EMAIL PROTECTED]> wrote:
>>>>>>
>>>>>>> Yanting,
>>>>>>>
>>>>>>> Also, it seems the schema you've provided is for an Oracle
>>>>>>> database, i.e. VARCHAR2 and NUMBER are datatypes specific to Oracle. Could
>>>>>>> you please use an Oracle connection string and driver? i.e. oracle.jdbc.driver.OracleDriver
>>>>>>> and jdbc:oracle:thin:@host:port:SID.
>>>>>>>
>>>>>>> -abe
>>>>>>>
>>>>>>>
>>>>>>> On Wed, Sep 4, 2013 at 7:32 PM, Mengwei Ding <[EMAIL PROTECTED]
>>>>>>> > wrote:
>>>>>>>
>>>>>>>> Hmm... would you mind showing us your most up-to-date job
>>>>>>>> configuration by typing "show job --jid 3"? I just want to make sure that
>>>>>>>> you have provided the partition column correctly.
>>>>>>>>
>>>>>>>> Also, I notice that the primary key for this table is of "VARCHAR(23)"
>>>>>>>> type; this might be the problem.
>>>>>>>>