Sqoop >> mail # user >> Getting exception on hive sqoop import using "--hive-import" in Oozie


Re: Getting exception on hive sqoop import using "--hive-import" in Oozie
Hi Nitin,

Using Sqoop's Hive import functionality from within Oozie is not supported.

The supported and recommended workaround is to split your Sqoop action into two actions instead:

* Sqoop action that will import data from your database to HDFS
* Hive action that will load data imported in previous action
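A minimal sketch of such a two-step workflow (the action names, transitions, staging directory, and the load.hql script here are hypothetical placeholders, not taken from your workflow):

```xml
<!-- Step 1 (hypothetical): Sqoop action imports to a plain HDFS staging dir,
     with no --hive-import, so no Hive metastore access happens inside Oozie. -->
<action name="sqoop-import-to-hdfs">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <arg>import</arg>
        <arg>--connect</arg>
        <arg>${jdbcUrl}</arg>
        <arg>--table</arg>
        <arg>${sourceTable}</arg>
        <arg>--target-dir</arg>
        <arg>/tmp/sqoop-staging/${wf:id()}</arg>
    </sqoop>
    <ok to="hive-load"/>
    <error to="fail"/>
</action>

<!-- Step 2 (hypothetical): Hive action loads the staged files into the table.
     load.hql would contain something like:
     LOAD DATA INPATH '/tmp/sqoop-staging/${wf:id()}' INTO TABLE target_table; -->
<action name="hive-load">
    <hive xmlns="uri:oozie:hive-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <script>load.hql</script>
    </hive>
    <ok to="end"/>
    <error to="fail"/>
</action>
```
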

Jarcec

On Wed, Jan 30, 2013 at 08:21:08PM -0500, Nitin kak wrote:
> I am getting a weird exception on executing this Oozie sqoop action. Any
> clues?
>
>
>  <action name="SWF_SYNC_DTRA-SQOOP_IMPORT">
>      <sqoop xmlns="uri:oozie:sqoop-action:0.2">
>          <job-tracker>${jobTracker}</job-tracker>
>          <name-node>${nameNode}</name-node>
>          <configuration>
>              <property>
>                  <name>sqoop.connection.factories</name>
>                  <value>com.cloudera.sqoop.manager.NetezzaManagerFactory</value>
>              </property>
>              <property>
>                  <name>oozie.hive.defaults</name>
>                  <value>${WF_HIVESITE_PATH}</value>
>              </property>
>          </configuration>
>          <arg>import</arg>
>          <arg>--connect</arg>
>          <arg>${SWF_SYNC_TRANS_SOURCE_JDBC_CONNECTION_URL}</arg>
>          <arg>--username</arg>
>          <arg>${SWF_SYNC_TRANS_SOURCE_HOST_USERNAME}</arg>
>          <arg>--password</arg>
>          <arg>${SWF_SYNC_TRANS_SOURCE_HOST_PASSWORD}</arg>
>          <arg>--table</arg>
>          <arg>${SWF_SYNC_TRANS_SOURCE_SYNC_OBJECT_NAME}</arg>
>          <arg>--where</arg>
>          <arg>instance_id=${SWF_SYNC_TRANS_SOURCE_SYNC_DATASET_INSTANCE_ID}</arg>
>          <arg>--hive-table</arg>
>          <arg>${SWF_SYNC_TRANS_DESTINATION_SYNC_OBJECT_NAME}_tmp_${WF_WFI_ID}</arg>
>          <arg>--columns</arg>
>          <arg>${SWF_SYNC_TRANS_SOURCE_DATA_COL_LIST}</arg>
>          <arg>--hive-import</arg>
>          <arg>--create-hive-table</arg>
>      </sqoop>
>      <ok to="SWF_SYNC_DTRA-LOAD_SYNC_TABLE"/>
>      <error to="SWF_SYNC_DTRA-LOGEVENT_ERROR"/>
>  </action>
>
>
> Here is the stack trace.
>
>
> 148363 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDOFatalDataStoreException: Cannot get a connection, pool error Could not create a validated object, cause: A read-only user or a user in a read-only database is not permitted to disable read-only mode on a connection.
> 148363 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - NestedThrowables:
> 148363 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - org.apache.commons.dbcp.SQLNestedException: Cannot get a connection, pool error Could not create a validated object, cause: A read-only user or a user in a read-only database is not permitted to disable read-only mode on a connection.
> 148363 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:991)
> 148363 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:976)
> 148363 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:7852)
> 148363 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:7251)
> 148364 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:243)
> 148364 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:430)
> 148364 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337)
> 148364 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - at org.apache.hadoop.hive.ql.Driver.run(Driver.java:889)
> 148364 [Thread-35] INFO  org.apache.sqoop.hive.HiveImport  - at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:255)