Sqoop, mail # user - Getting exception on hive sqoop import using "--hive-import" in Oozie


Nitin kak 2013-01-31, 01:21
Jarek Jarcec Cecho 2013-01-31, 16:13
Re: Getting exception on hive sqoop import using "--hive-import" in Oozie
Kathleen Ting 2013-01-31, 02:11
Hi Nitin,

Splitting the command on spaces into separate <arg> elements is not needed - can you try this instead?

<arg>import --connect ${SWF_SYNC_TRANS_SOURCE_JDBC_CONNECTION_URL}
--username ${SWF_SYNC_TRANS_SOURCE_HOST_USERNAME} --password
${SWF_SYNC_TRANS_SOURCE_HOST_PASSWORD} --table
${SWF_SYNC_TRANS_SOURCE_SYNC_OBJECT_NAME} --where
instance_id=${SWF_SYNC_TRANS_SOURCE_SYNC_DATASET_INSTANCE_ID}
--hive-table ${SWF_SYNC_TRANS_DESTINATION_SYNC_OBJECT_NAME}_tmp_${WF_WFI_ID}
--columns ${SWF_SYNC_TRANS_SOURCE_DATA_COL_LIST} --hive-import
--create-hive-table select
facility_no,max(asof_yyyymm),max(int_type_cd) as
int_typ_cd,max(app_sys_no),substr(avg(base_index_cd_rollup),1,5),avg(base_rt_plus_minus_factor),avg(nominal_bank_int_rt)
from test.loans_history_pt where asof_yyyymm = (select
max(asof_yyyymm) from test.loans_history_pt) AND \$CONDITIONS Group by
facility_no order by facility_no</arg>

Regards, Kathleen
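
For completeness: the Oozie sqoop action schema (uri:oozie:sqoop-action:0.2) also accepts a single <command> element that Oozie splits on whitespace, as an alternative to one <arg> element per token. A minimal sketch of that form - the action name, transitions, and ${...} values below are placeholders, not taken from this workflow:

    <action name="SQOOP_IMPORT_EXAMPLE">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <!-- the whole Sqoop invocation in one element; Oozie splits it on
                 whitespace, so any single argument that itself contains spaces
                 still has to be passed via individual <arg> elements instead -->
            <command>import --connect ${jdbcUrl} --username ${dbUser} --password ${dbPassword} --table ${srcTable} --hive-import --hive-table ${hiveTable}</command>
        </sqoop>
        <ok to="NEXT_ACTION"/>
        <error to="ERROR_HANDLER"/>
    </action>

The per-token <arg> form remains the safer choice whenever one argument value (for example a --where or --query clause) contains embedded spaces.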

On Wed, Jan 30, 2013 at 5:21 PM, Nitin kak <[EMAIL PROTECTED]> wrote:
> I am getting a weird exception on executing this Oozie sqoop action. Any
> clues?
>
>
>  <action name="SWF_SYNC_DTRA-SQOOP_IMPORT">
>     <sqoop xmlns="uri:oozie:sqoop-action:0.2">
>         <job-tracker>${jobTracker}</job-tracker>
>         <name-node>${nameNode}</name-node>
>         <configuration>
>             <property>
>                 <name>sqoop.connection.factories</name>
>                 <value>com.cloudera.sqoop.manager.NetezzaManagerFactory</value>
>             </property>
>             <property>
>                 <name>oozie.hive.defaults</name>
>                 <value>${WF_HIVESITE_PATH}</value>
>             </property>
>         </configuration>
>         <arg>import</arg>
>         <arg>--connect</arg>
>         <arg>${SWF_SYNC_TRANS_SOURCE_JDBC_CONNECTION_URL}</arg>
>         <arg>--username</arg>
>         <arg>${SWF_SYNC_TRANS_SOURCE_HOST_USERNAME}</arg>
>         <arg>--password</arg>
>         <arg>${SWF_SYNC_TRANS_SOURCE_HOST_PASSWORD}</arg>
>         <arg>--table</arg>
>         <arg>${SWF_SYNC_TRANS_SOURCE_SYNC_OBJECT_NAME}</arg>
>         <arg>--where</arg>
>         <arg>instance_id=${SWF_SYNC_TRANS_SOURCE_SYNC_DATASET_INSTANCE_ID}</arg>
>         <arg>--hive-table</arg>
>         <arg>${SWF_SYNC_TRANS_DESTINATION_SYNC_OBJECT_NAME}_tmp_${WF_WFI_ID}</arg>
>         <arg>--columns</arg>
>         <arg>${SWF_SYNC_TRANS_SOURCE_DATA_COL_LIST}</arg>
>         <arg>--hive-import</arg>
>         <arg>--create-hive-table</arg>
>     </sqoop>
>     <ok to="SWF_SYNC_DTRA-LOAD_SYNC_TABLE"/>
>     <error to="SWF_SYNC_DTRA-LOGEVENT_ERROR"/>
> </action>
>
>
> Here is the stack trace.
>
>
> 148363 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDOFatalDataStoreException: Cannot get a connection, pool error Could not create a validated object, cause: A read-only user or a user in a read-only database is not permitted to disable read-only mode on a connection.
> 148363 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - NestedThrowables:
> 148363 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - org.apache.commons.dbcp.SQLNestedException: Cannot get a connection, pool error Could not create a validated object, cause: A read-only user or a user in a read-only database is not permitted to disable read-only mode on a connection.
> 148363 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:991)
> 148363 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:976)
> 148363 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:7852)
> 148363 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:7251)
> 148364 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - at
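
The "read-only user or a user in a read-only database" wording is an Apache Derby message, which suggests the Hive client started by the Sqoop import is falling back to an embedded Derby metastore rather than the metastore described by ${WF_HIVESITE_PATH}. A hedged sketch of the standard hive-site.xml metastore settings worth checking on the node that runs the launcher - the host names and URLs below are placeholders only:

    <configuration>
        <!-- point the Hive client at a shared metastore service instead of the
             embedded Derby default; the thrift URI below is a placeholder -->
        <property>
            <name>hive.metastore.uris</name>
            <value>thrift://metastore-host:9083</value>
        </property>
        <!-- or, for a JDBC-backed metastore, the usual javax.jdo.option.* settings
             (placeholder MySQL example) -->
        <property>
            <name>javax.jdo.option.ConnectionURL</name>
            <value>jdbc:mysql://metastore-db-host:3306/hive_metastore</value>
        </property>
        <property>
            <name>javax.jdo.option.ConnectionDriverName</name>
            <value>com.mysql.jdbc.Driver</value>
        </property>
    </configuration>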
Nitin kak 2013-01-31, 04:00