I'm reluctant to answer this, as I'm far from knowledgeable in this area.
However, the error you're getting looks like the one I always get when I
don't have a Hive server running. I think your hcat commands can be
serviced without a Hive server, but your failing command cannot.

In other words you need to execute something like this first:

$ nohup hive --service hiveserver &

Also, the URL that you are using looks a little strange.
Normally you would have something like

$ curl http://localhost:50111/templeton/v1/ddl/database/default?user.name=<myname>

so your URL suggests that you have a database called "testtable"?
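To make the distinction concrete, here is roughly how the Templeton DDL URLs are laid out. The hostname, port, and the names "default"/"testtable"/"myname" are placeholders for your own values:

```shell
# Sketch of WebHCat (Templeton) DDL URL layout -- host, port, database,
# table, and user name below are example values, not anything specific
# to your install.
BASE="http://localhost:50111/templeton/v1"

# Describe a DATABASE (the path segment after /database/ is a database name):
DB_URL="$BASE/ddl/database/default?user.name=myname"

# Describe a TABLE within a database (note the extra /table/<name> segment):
TABLE_URL="$BASE/ddl/database/default/table/testtable?user.name=myname"

echo "$DB_URL"
echo "$TABLE_URL"
# Then fetch either one with, e.g.:  curl -s "$DB_URL"
```

So if testtable is a table in the default database, the second form is likely what you want.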

I really hope that this helps.


Peter Marron
Senior Developer
Trillium Software, A Harte Hanks Company
Theale Court, 1st Floor, 11-13 High Street
+44 (0) 118 940 7609 office
+44 (0) 118 940 7699 fax
trilliumsoftware.com / linkedin / twitter / facebook

From: Adam Silberstein [EMAIL PROTECTED]
Sent: 17 March 2014 22:13
Subject: Re: org.apache.hadoop.hive.metastore.HiveMetaStoreClient with webhcat REST

Didn't get any answers on this, trying one more time.


On Mar 14, 2014, at 9:50 AM, Adam Silberstein <[EMAIL PROTECTED]> wrote:
I'm testing out the REST interface to webhcat and stuck doing basic DDL operations.

Background on installation:

I have successfully created and loaded tables with the hcat command line, e.g.: hcat -e 'create table testtable (a string, b string, c int);'
I did some loading by hand and via Pig, so I think the HCatalog service is running correctly.

Here's what I am seeing via curl:
$ curl http://localhost:50111/templeton/v1/ddl/database/testtable?user.name=<myname>

{"errorDetail":"\norg.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient\n\tat org.apache.hadoop.hive.ql.exec.DDLTask.descDatabase(DDLTask.java:2647)\n\tat org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:244)\n\tat org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)\n\tat org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:66)\n\tat org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1383)\n\tat org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1169)\n\tat org.apache.hadoop.hive.ql.Driver.run(Driver.java:982)\n\tat org.apache.hadoop.hive.ql.Driver.run(Driver.java:902)\n\tat org.apache.hcatalog.cli.HCatDriver.run(HCatDriver.java:43)\n\tat org.apache.hcatalog.cli.HCatCli.processCmd(HCatCli.java:251)\n\tat org.apache.hcatalog.cli.HCatCli.processLine(HCatCli.java:205)\n\tat org.apache.hcatalog.cli.HCatCli.main(HCatCli.java:164)\n\tat sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\n\tat sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)\n\tat sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\n\tat java.lang.reflect.Method.invoke(Method.java:622)\n\tat org.apache.hadoop.util.RunJar.main(RunJar.java:208)\n","error":"FAILED: Error in metadata: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient","errorCode":500}

I found a bunch of suggestions online, so I tried adding some things to webhcat-site.xml:

- templeton.libjars: I added paths to a number of libraries, including '/usr/lib/hive/lib/hive-metastore-0.10.0-cdh4.5.0.jar', which contains the missing HiveMetaStoreClient class. Then I restarted webhcat; I can't tell whether it picked up this property.
- Less promising: I uploaded the Hive tar.gz file to HDFS and updated the paths in templeton.hive.archive and templeton.hive.path. I'm skeptical the server would look in HDFS for libraries, and this didn't help either.
- There is also older material from around 2011, but I'm ignoring that.
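For what it's worth, the templeton.libjars entry I added looked roughly like the fragment below. The jar path comes from my CDH 4.5.0 layout and will differ on other installs; WebHCat expects a comma-separated list here if you add more than one jar:

```xml
<!-- Sketch of the webhcat-site.xml property I added. The jar path is
     specific to my CDH 4.5.0 install; substitute your own hive lib dir. -->
<property>
  <name>templeton.libjars</name>
  <value>/usr/lib/hive/lib/hive-metastore-0.10.0-cdh4.5.0.jar</value>
</property>
```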

If you have any suggestions please share.  Thanks in advance!


