Re: Hive Metadata URI error
Sam: I added "file://". Now it looks like this:
<value>file:///home/users/jtv/CDH3/hive/conf/metastore_db</value>

The problem has not gone away; I still get the same error. I tried
rebooting my EC2 instance, but that made no difference.

What does "does not have a scheme" mean? What is it expecting?

Thanks,
PD.
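
As far as I can tell, the "scheme" in that message is the URI scheme that
Hive parses out of hive.metastore.uris. That property is meant to hold
Thrift endpoints for a remote metastore server rather than a filesystem
path, so a value shaped like the illustrative one below should at least
satisfy the parser, while a local path, with or without file:// in front,
will not. The host and port here are placeholders, not values from this
thread:

<property>
  <name>hive.metastore.uris</name>
  <value>thrift://metastore-host.example.com:9083</value>
</property>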

On Sun, Dec 11, 2011 at 9:15 PM, Sam Wilson <[EMAIL PROTECTED]> wrote:

> Try file:// in front of the property value...
>
> Sent from my iPhone
>
> On Dec 12, 2011, at 12:07 AM, "Periya.Data" <[EMAIL PROTECTED]> wrote:
>
> Hi,
>    I am trying to create Hive tables on an EC2 instance. I get a
> strange error about a URI scheme, plus a warning that hive-log4j.properties
> is not found, and I do not know how to fix either one.
>
> On EC2 instance : Ubuntu 10.04, Hive-0.7.1-cdh3u2.
>
> Initially I did not have an entry for the hive.metastore.uris property in my
> hive-default.xml file, so I created one. Still, I get the errors pasted
> below. I was under the assumption that if we leave the uris value
> blank, Hive will assume the local metastore.
>
> <property>
>   <name>hive.metastore.local</name>
>   <value>true</value>
>   <description>controls whether to connect to remove metastore server or
> open a new metastore server in Hive Client JVM</description>
> </property>
>
> <property>
>   <name>hive.metastore.uris</name>
>   <value>/home/users/jtv/CDH3/hive/conf/metastore_db</value>
> </property>
>
>
> root@ip-10-114-18-63:/home/users/jtv# hive -f ./scripts/log25.q
> hive-log4j.properties not found
> Hive history file=/tmp/root/hive_job_log_root_201112120332_1795396613.txt
> 11/12/12 03:32:03 INFO exec.HiveHistory: Hive history
> file=/tmp/root/hive_job_log_root_201112120332_1795396613.txt
> 11/12/12 03:32:03 INFO parse.ParseDriver: Parsing command: CREATE TABLE
> log25_tbl (OperationEvent STRING, HostIP STRING, StartTime STRING,
> SourceRepo STRING, SourceFolder STRING, DestRepo STRING, DestFolder STRING,
> EntityOrObject STRING, BytesSent STRING, TotalTimeInSecs STRING) COMMENT
> 'This is the Log_25 Table'
> 11/12/12 03:32:04 INFO parse.ParseDriver: Parse Completed
> 11/12/12 03:32:04 INFO parse.SemanticAnalyzer: Starting Semantic Analysis
> 11/12/12 03:32:04 INFO parse.SemanticAnalyzer: Creating table log25_tbl
> position=13
> 11/12/12 03:32:04 INFO ql.Driver: Semantic Analysis Completed
> 11/12/12 03:32:04 INFO ql.Driver: Returning Hive schema:
> Schema(fieldSchemas:null, properties:null)
> 11/12/12 03:32:04 INFO ql.Driver: Starting command: CREATE TABLE log25_tbl
> (OperationEvent STRING, HostIP STRING, StartTime STRING, SourceRepo STRING,
> SourceFolder STRING, DestRepo STRING, DestFolder STRING, EntityOrObject
> STRING, BytesSent STRING, TotalTimeInSecs STRING) COMMENT 'This is the
> Log_25 Table'
> 11/12/12 03:32:04 INFO exec.DDLTask: Default to LazySimpleSerDe for table
> log25_tbl
> 11/12/12 03:32:04 INFO hive.log: DDL: struct log25_tbl { string
> operationevent, string hostip, string starttime, string sourcerepo, string
> sourcefolder, string destrepo, string destfolder, string entityorobject,
> string bytessent, string totaltimeinsecs}
> FAILED: Error in metadata: java.lang.IllegalArgumentException: URI:  does
> not have a scheme
> 11/12/12 03:32:04 ERROR exec.DDLTask: FAILED: Error in metadata:
> java.lang.IllegalArgumentException: URI:  does not have a scheme
> org.apache.hadoop.hive.ql.metadata.HiveException:
> java.lang.IllegalArgumentException: URI:  does not have a scheme
>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:476)
>     at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3176)
>     at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:213)
>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:130)
>     at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1063)
>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:900)
>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:748)
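
For comparison, below is a minimal sketch of the usual local (embedded)
metastore configuration, assuming the stock Derby-backed setup that Hive
ships with: hive.metastore.uris is simply left blank, and the database
location is carried by javax.jdo.option.ConnectionURL instead. The Derby
URL and driver shown are the shipped defaults, not values taken from this
thread:

<property>
  <name>hive.metastore.local</name>
  <value>true</value>
</property>

<property>
  <name>hive.metastore.uris</name>
  <value></value>
</property>

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:;databaseName=metastore_db;create=true</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>org.apache.derby.jdbc.EmbeddedDriver</value>
</property>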