Hive, mail # user - create table exception


Re: create table exception
Arvind Prabhakar 2010-04-05, 20:51
Hi Sagar,

> As a trial, I am trying to set up hive for local DFS, MR mode

You can do this as follows:

1. Set your HADOOP_HOME to the local hadoop installation. The configuration
files (core-site.xml, mapred-site.xml, and hdfs-site.xml) in
$HADOOP_HOME/conf should be empty configurations with no properties
specified, as sketched below.
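
For example, an empty core-site.xml (and likewise mapred-site.xml and
hdfs-site.xml) would look something like this, which makes hadoop fall back
to its built-in defaults (local file system and local job runner):

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- no properties specified: hadoop uses its built-in defaults -->
<configuration>
</configuration>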

2. In your HIVE_HOME/conf directory, create a file called hive-site.xml and
set javax.jdo.option.ConnectionURL to
jdbc:derby:;databaseName=/path/to/your/metastore;create=true.
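
A minimal hive-site.xml sketch along those lines (the databaseName path is
just a placeholder - point it wherever you want the metastore):

<?xml version="1.0"?>
<configuration>
 <property>
  <name>javax.jdo.option.ConnectionURL</name>
  <!-- placeholder path; adjust to where you want the metastore -->
  <value>jdbc:derby:;databaseName=/path/to/your/metastore;create=true</value>
 </property>
</configuration>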

Now you should be able to create the metastore database locally wherever you
want, and your dfs and map-reduce systems should run locally as well.

It is my understanding that this should be the default setup anyway for both
hadoop and hive, except for the custom connection URL for the metastore. If
you do not specify a custom connection URL, the metastore is created in
${PWD}/metastore_db, which is also local/embedded.
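
If I remember right, the stock default in hive-default.xml is simply:

<property>
 <name>javax.jdo.option.ConnectionURL</name>
 <!-- Derby database created relative to the current working directory -->
 <value>jdbc:derby:;databaseName=metastore_db;create=true</value>
</property>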

Arvind

On Mon, Apr 5, 2010 at 11:55 AM, Sagar Naik <[EMAIL PROTECTED]> wrote:

> Hi
> I tried to set it up in embedded mode, the easiest one :)
> Still no luck
>
> <property>
>  <name>mapred.reduce.tasks</name>
>  <value>local</value>
>    <description>The default number of reduce tasks per job.  Typically set
>  to a prime close to the number of available hosts.  Ignored when
>  mapred.job.tracker is "local". Hadoop sets this to 1 by default, whereas
>  Hive uses -1 as its default value. By setting this property to -1, Hive
>  will automatically figure out the number of reducers.
>  </description>
> </property>
>
> <property>
>  <name>fs.default.name</name>
>  <value>namenode:54310</value>
> </property>
>
> <property>
>  <name>javax.jdo.option.ConnectionURL</name>
>  <value>jdbc:derby:;databaseName=/data/hive/hive_metastore_db;create=true</value>
> </property>
>
>
> <property>
>  <name>javax.jdo.option.ConnectionDriverName</name>
>  <value>org.apache.derby.jdbc.EmbeddedDriver</value>
> </property>
>
>
> <property>
>  <name>hive.metastore.warehouse.dir</name>
>  <value>file:///data/hive/warehouse</value>
> </property>
>
>
> <property>
>  <name>hive.metastore.local</name>
>  <value>true</value>
> </property>
>
> I made sure that hive-site.xml is in the classpath
>
>
> bin/hive
> hive-log4j.properties not found
> Hive history file=/tmp/argus/hive_job_log_argus_201004051154_330230103.txt
> 10/04/05 11:54:06 [main] INFO exec.HiveHistory: Hive history file=/tmp/argus/hive_job_log_argus_201004051154_330230103.txt
> hive> CREATE TABLE pokes (foo INT, bar STRING);
> 10/04/05 11:54:10 [main] INFO parse.ParseDriver: Parsing command: CREATE TABLE pokes (foo INT, bar STRING)
> 10/04/05 11:54:10 [main] INFO parse.ParseDriver: Parse Completed
> 10/04/05 11:54:10 [main] INFO parse.SemanticAnalyzer: Starting Semantic Analysis
> 10/04/05 11:54:10 [main] INFO parse.SemanticAnalyzer: Creating tablepokes positin=13
> 10/04/05 11:54:10 [main] INFO ql.Driver: Semantic Analysis Completed
> 10/04/05 11:54:10 [main] INFO ql.Driver: Starting command: CREATE TABLE pokes (foo INT, bar STRING)
> 10/04/05 11:54:10 [main] INFO exec.DDLTask: Default to LazySimpleSerDe for table pokes
> 10/04/05 11:54:10 [main] INFO hive.log: DDL: struct pokes { i32 foo, string bar}
> FAILED: Error in metadata: java.lang.IllegalArgumentException: URI:  does not have a scheme
> 10/04/05 11:54:11 [main] ERROR exec.DDLTask: FAILED: Error in metadata: java.lang.IllegalArgumentException: URI:  does not have a scheme
> org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.IllegalArgumentException: URI:  does not have a scheme
>        at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:281)
>        at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:1281)
>        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:119)
>        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:99)
>        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:64)
>        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:582)