Hive >> mail # user >> create table exception


Re: create table exception
I referred to
http://www.mazsoft.com/blog/post/2010/02/01/Setting-up-HadoopHive-to-use-MySQL-as-metastore.aspx
when I set up my metastore.

You can try that approach as well.
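For what it's worth, the MySQL-backed setup that post describes comes down to a handful of hive-site.xml properties along these lines (the host, database name, and credentials below are placeholders, not values from the post):

```xml
<!-- Point the metastore at a MySQL database instead of embedded Derby. -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://dbhost/hive_metastore?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hivepass</value>
</property>
```

The MySQL JDBC driver jar also has to be on Hive's classpath (e.g. dropped into lib/) for the EmbeddedDriver-to-MySQL switch to work.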

On Mon, Apr 5, 2010 at 11:55 AM, Sagar Naik <[EMAIL PROTECTED]> wrote:

> Hi
> I tried to set it up in embedded mode, the easiest one :)
> Still no luck
>
> <property>
>  <name>mapred.reduce.tasks</name>
>  <value>local</value>
>    <description>The default number of reduce tasks per job.  Typically set
>  to a prime close to the number of available hosts.  Ignored when
>  mapred.job.tracker is "local". Hadoop set this to 1 by default, whereas
> hive uses -1 as its default value.
>  By setting this property to -1, Hive will automatically figure out what
> should be the number of reducers.
>  </description>
> </property>
>
> <property>
>  <name>fs.default.name</name>
>  <value>namenode:54310</value>
> </property>
>
> <property>
>  <name>javax.jdo.option.ConnectionURL</name>
>
>  <value>jdbc:derby:;databaseName=/data/hive/hive_metastore_db;create=true</value>
> </property>
>
>
> <property>
>  <name>javax.jdo.option.ConnectionDriverName</name>
>  <value>org.apache.derby.jdbc.EmbeddedDriver</value>
> </property>
>
>
> <property>
>  <name>hive.metastore.warehouse.dir</name>
>  <value>file:///data/hive/warehouse</value>
> </property>
>
>
> <property>
>  <name>hive.metastore.local</name>
>  <value>true</value>
> </property>
>
> I made sure that hive-site.xml is in the classpath
>
>
> bin/hive
> hive-log4j.properties not found
> Hive history file=/tmp/argus/hive_job_log_argus_201004051154_330230103.txt
> 10/04/05 11:54:06 [main] INFO exec.HiveHistory: Hive history
> file=/tmp/argus/hive_job_log_argus_201004051154_330230103.txt
> hive> CREATE TABLE pokes (foo INT, bar STRING);
> 10/04/05 11:54:10 [main] INFO parse.ParseDriver: Parsing command: CREATE
> TABLE pokes (foo INT, bar STRING)
> 10/04/05 11:54:10 [main] INFO parse.ParseDriver: Parse Completed
> 10/04/05 11:54:10 [main] INFO parse.SemanticAnalyzer: Starting Semantic
> Analysis
> 10/04/05 11:54:10 [main] INFO parse.SemanticAnalyzer: Creating tablepokes
> positin=13
> 10/04/05 11:54:10 [main] INFO ql.Driver: Semantic Analysis Completed
> 10/04/05 11:54:10 [main] INFO ql.Driver: Starting command: CREATE TABLE
> pokes (foo INT, bar STRING)
> 10/04/05 11:54:10 [main] INFO exec.DDLTask: Default to LazySimpleSerDe for
> table pokes
> 10/04/05 11:54:10 [main] INFO hive.log: DDL: struct pokes { i32 foo, string
> bar}
> FAILED: Error in metadata: java.lang.IllegalArgumentException: URI:  does
> not have a scheme
> 10/04/05 11:54:11 [main] ERROR exec.DDLTask: FAILED: Error in metadata:
> java.lang.IllegalArgumentException: URI:  does not have a scheme
> org.apache.hadoop.hive.ql.metadata.HiveException:
> java.lang.IllegalArgumentException: URI:  does not have a scheme
>        at
> org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:281)
>        at
> org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:1281)
>        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:119)
>        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:99)
>        at
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:64)
>        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:582)
>        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:462)
>        at org.apache.hadoop.hive.ql.Driver.runCommand(Driver.java:324)
>        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:312)
>        at
> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:123)
>        at
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:181)
>        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:287)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
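For what it's worth, that IllegalArgumentException is the generic complaint Hadoop raises when a filesystem location parses to a URI with no scheme (note that the URI printed in the message is empty). A minimal sketch of the kind of check involved, using plain java.net.URI (the class and method names below are mine, not Hive's):

```java
import java.net.URI;

public class SchemeCheck {
    // Returns true when the given location string carries a URI scheme,
    // which is what Hadoop's filesystem resolution expects to find.
    static boolean hasScheme(String location) {
        return URI.create(location).getScheme() != null;
    }

    public static void main(String[] args) {
        // A bare path parses fine but has a null scheme; Hadoop rejects
        // such values with a "does not have a scheme" error.
        System.out.println(hasScheme("/data/hive/warehouse"));
        // Prefixing file:// (or hdfs://) supplies the scheme.
        System.out.println(hasScheme("file:///data/hive/warehouse"));
        System.out.println(hasScheme("hdfs://namenode:54310/"));
    }
}
```

If that is what is biting you here, the usual suspect in a config like the one above is fs.default.name: it is set to namenode:54310 with no scheme, and writing it as hdfs://namenode:54310 is a common fix for scheme-related failures. I can't say for certain that's the source of the empty URI in this trace, though, so it's worth double-checking each path-valued property.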