I believe you are using the Derby metastore, and this would be an issue with the Hive configs.
Derby creates the metastore in the current directory from which you start Hive. The tables imported by Sqoop live under HIVE_HOME, which is why you cannot see them when you start the Hive CLI from any other location.
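A minimal illustration of the behavior (assuming the default embedded-Derby config; the paths are just examples):

```
$ cd /tmp
$ hive -e 'show tables;'      # empty; Derby creates /tmp/metastore_db here
$ cd $HIVE_HOME
$ hive -e 'show tables;'      # the Sqoop-imported tables show up here
```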
To have a single, shared metastore db, configure a specific directory in javax.jdo.option.ConnectionURL in hive-site.xml. In your connection URL, set the db name as "databaseName=/home/hive/metastore_db".
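For example, in hive-site.xml (the path is just an example; any fixed, writable directory works):

```xml
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:;databaseName=/home/hive/metastore_db;create=true</value>
  <description>Absolute databaseName so every Hive CLI session uses the same Derby metastore</description>
</property>
```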
Sent from remote device, Please excuse typos
From: Cyril Bogus <[EMAIL PROTECTED]>
Date: Mon, 25 Feb 2013 10:34:29
To: <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: Re: Hive queries
I do not get any errors.
It is only when I run hive and try to query the tables I imported. Let's
say I want to get only the numeric tuples from a given table. I cannot find
the table (show tables; returns nothing) unless I go into the Hive home
folder and run hive from there. I would expect the state of Hive to be the
same everywhere I run it from, but so far that is not the case.
On Mon, Feb 25, 2013 at 10:22 AM, Nitin Pawar <[EMAIL PROTECTED]> wrote:
> any errors you see?
> On Mon, Feb 25, 2013 at 8:48 PM, Cyril Bogus <[EMAIL PROTECTED]> wrote:
>> Hi everyone,
>> My setup is Hadoop 1.0.4, Hive 0.9.0, Sqoop 1.4.2-hadoop 1.0.0, and
>> Mahout 0.7.
>> I have imported tables from a remote database directly into Hive using
>> Sqoop. Somehow, when I run Sqoop from Hadoop, Hive gives me trouble in the
>> bookkeeping of where the imported tables are located. I have a single-node
>> setup.
>> Thank you for any answers, and feel free to ask questions if I was not
>> specific enough about my issue.
> Nitin Pawar