Re: import from Oracle to Hive : 2 errors
Hi Venkat,

Thanks for your response.

I realize that my question was a silly one =).

I have rewritten the Sqoop script with the --hive-import option, like this:

sqoop import --connect jdbc:oracle:thin:@xx.xx.xx.xx:1521/D_BI \
  --username visiteur --password visiteur \
  --hive-import --create-hive-table \
  --table DT_PILOTAGE.DEMARQUE_MAG_JOUR --where "ROWNUM <= 1000" \
  --hive-table default.DEMARQUE_MAG_JOUR

But I get a new error:

13/06/18 15:59:33 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 1
        at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:364)
        at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:314)
        at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:226)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:415)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:238)

In the Hive logs, we can read the following:

Caused by: java.sql.SQLException: Failed to start database 'metastore_db', see the next exception for details.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 85 more
Caused by: java.sql.SQLException: Another instance of Derby may have already booted the database /home/hduser/metastore_db.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
    ... 82 more
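
For context, Derby's embedded mode lets only one process boot the metastore database at a time, so this error usually means another Hive session, or a stale lock left behind by a crashed one, is still holding /home/hduser/metastore_db. A rough sketch of the usual checks, with the path taken from the log above (the commands themselves are illustrative, not from this thread):

# Is another Hive process (CLI, HiveServer) still running?
ps aux | grep -i hive

# If a previous session crashed, stale Derby lock files may remain:
ls /home/hduser/metastore_db/*.lck
# Remove them only once no Hive process is using the database:
# rm /home/hduser/metastore_db/db.lck /home/hduser/metastore_db/dbex.lck

If concurrent sessions are needed, the usual longer-term fix is to move the metastore off embedded Derby (for example onto MySQL) by changing javax.jdo.option.ConnectionURL in hive-site.xml.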

Thanks.

Jérôme
2013/6/18 Venkat <[EMAIL PROTECTED]>

> Hi Jerome
>
> You can see the following message in your output.
>
> 13/06/18 12:05:21 WARN tool.BaseSqoopTool: It seems that you've specified
> at least one of following:
> 13/06/18 12:05:21 WARN tool.BaseSqoopTool:      --hive-home
> 13/06/18 12:05:21 WARN tool.BaseSqoopTool:      --hive-overwrite
> 13/06/18 12:05:21 WARN tool.BaseSqoopTool:      --create-hive-table
> 13/06/18 12:05:21 WARN tool.BaseSqoopTool:      --hive-table
> 13/06/18 12:05:21 WARN tool.BaseSqoopTool:      --hive-partition-key
> 13/06/18 12:05:21 WARN tool.BaseSqoopTool:      --hive-partition-value
> 13/06/18 12:05:21 WARN tool.BaseSqoopTool:      --map-column-hive
> As the warning message shows, --create-hive-table without --hive-import
> does not have any effect.  So you may want to add --hive-import to the
> command line; it is what enables Hive imports.  These additional options
> take effect only if Hive imports are enabled.
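>
> For example, a minimal command with Hive import enabled might look like
> the following (the connect string, credentials, and table names here are
> placeholders, not taken from this thread):
>
> sqoop import --connect jdbc:oracle:thin:@host:1521/DB \
>   --username user --password pass \
>   --table SOME_SCHEMA.SOME_TABLE \
>   --hive-import --hive-table default.SOME_TABLE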
>
> Thanks
>
> Venkat
>
>
> On Tue, Jun 18, 2013 at 3:36 AM, Jérôme Verdier <[EMAIL PROTECTED]> wrote:
>
>> Hi Jarcec,
>>
>> Thanks for your explanations; they help me understand how Sqoop works.
>>
>> I'm trying to import 1000 rows from a quite big Oracle table, which is
>> divided into partitions to keep query times reasonable.
>>
>> I am using this Sqoop script, with a query that selects only the first
>> 1000 rows:
>>
>> sqoop import --connect jdbc:oracle:thin:@xx.xx.xx.xx:1521/D_BI \
>>   --username xx --password xx --create-hive-table \
>>   --query 'SELECT * FROM DT_PILOTAGE.DEMARQUE_MAG_JOUR WHERE ROWNUM<1000 AND $CONDITIONS' \
>>   --target-dir /home/hduser --split-by DEMARQUE_MAG_JOUR.CO_SOCIETE \
>>   --hive-table default.DEMARQUE_MAG_JOUR
>>
>> The M/R job works fine, but as we can see in the result below, the data
>> is not moved to Hive.