Hive, mail # user - cli timeouts


Re: cli timeouts
Travis Crawford 2012-08-01, 19:02
Interesting - this issue would certainly go away with local mode, as
there's no thrift call to fail. I'd very much prefer to run the
HiveMetaStore (HMS) as a centralized service, though.

Thanks for the info - I'll have to take a look at how the thrift
client handles timeouts/reconnects/etc.

--travis
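
For reference, a minimal sketch (not Hive's actual CLI code) of the kind of
timeout/reconnect handling mentioned above, assuming the Thrift-generated
ThriftHiveMetastore client over TFramedTransport as described later in the
thread; the host, port, timeout, and database name are placeholders:

import org.apache.thrift.protocol.TBinaryProtocol;
import org.apache.thrift.transport.TFramedTransport;
import org.apache.thrift.transport.TSocket;
import org.apache.thrift.transport.TTransport;
import org.apache.thrift.transport.TTransportException;
import org.apache.hadoop.hive.metastore.api.Database;
import org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore;

public class RetryingMetastoreCall {
  // Open a framed transport with a read timeout (milliseconds); without a
  // timeout, a read on a connection the server has dropped can block forever.
  private static TTransport open(String host, int port, int timeoutMs)
      throws TTransportException {
    TTransport t = new TFramedTransport(new TSocket(host, port, timeoutMs));
    t.open();
    return t;
  }

  public static void main(String[] args) throws Exception {
    TTransport transport = open("metastore-host", 9083, 60 * 1000);
    ThriftHiveMetastore.Client client =
        new ThriftHiveMetastore.Client(new TBinaryProtocol(transport));
    Database db;
    try {
      db = client.get_database("default");
    } catch (TTransportException e) {
      // The connection was likely dropped (e.g. an idle timeout on the
      // server side): close it, reopen, and retry the call once.
      transport.close();
      transport = open("metastore-host", 9083, 60 * 1000);
      client = new ThriftHiveMetastore.Client(new TBinaryProtocol(transport));
      db = client.get_database("default");
    }
    System.out.println("got database: " + db.getName());
    transport.close();
  }
}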
On Wed, Aug 1, 2012 at 11:57 AM, Edward Capriolo <[EMAIL PROTECTED]> wrote:
> The two setup options are:
>
> cli->thriftmetastore->jdbc
>
> cli->jdbc (used to be called local mode)
>
> Local mode has fewer moving parts, so I prefer it.
>
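
As a rough illustration of the two setups described above (not taken from the
thread), here is how they could be expressed programmatically through HiveConf;
in practice these properties would normally live in hive-site.xml, and the
thrift URI and JDBC URL below are placeholders:

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;

public class MetastoreSetupSketch {
  public static void main(String[] args) throws Exception {
    // Option 1: cli -> thriftmetastore -> jdbc
    // The client talks to a centralized HiveMetaStore thrift service.
    HiveConf remote = new HiveConf();
    remote.setVar(HiveConf.ConfVars.METASTOREURIS,
        "thrift://metastore-host:9083");

    // Option 2: cli -> jdbc ("local mode")
    // hive.metastore.uris is left empty, so the metastore runs embedded in
    // the client process and talks to the backing database directly over JDBC.
    HiveConf local = new HiveConf();
    local.setVar(HiveConf.ConfVars.METASTOREURIS, "");
    local.setVar(HiveConf.ConfVars.METASTORECONNECTURLKEY,
        "jdbc:mysql://db-host/metastore");

    // Either conf can be handed to HiveMetaStoreClient; with an empty
    // hive.metastore.uris it works against the embedded metastore.
    HiveMetaStoreClient client = new HiveMetaStoreClient(local);
    System.out.println(client.getAllDatabases());
    client.close();
  }
}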
> On Wed, Aug 1, 2012 at 2:54 PM, Travis Crawford
> <[EMAIL PROTECTED]> wrote:
>> Oh, interesting - you're saying that instead of running a single
>> HiveMetaStore thrift service, most users use the embedded
>> HiveMetaStore mode and have each CLI instance connect to the DB
>> directly?
>>
>> --travis
>>
>>
>> On Wed, Aug 1, 2012 at 11:47 AM, Edward Capriolo <[EMAIL PROTECTED]> wrote:
>>> I feel that the thrift metastore interface is very rarely used in the
>>> wild. The only use case I can figure out for it is people with very
>>> in-depth Hive experience who do not wish to interact with Hive through
>>> the QL language. That being said, I would think the coverage might be a
>>> little weak there. With the local metastore, users have DataNucleus
>>> providing support for reconnection, etc.
>>>
>>> On Wed, Aug 1, 2012 at 2:35 PM, Travis Crawford
>>> <[EMAIL PROTECTED]> wrote:
>>>> I'm using the thrift metastore via TFramedTransport. What value do you
>>>> specify for hive.metastore.client.socket.timeout? I'm using 60.
>>>>
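
As a side note on the value above: hive.metastore.client.socket.timeout is
specified in seconds, so 60 would mean a one-minute socket timeout. A minimal,
hedged sketch of overriding it programmatically (the equivalent of setting it
in hive-site.xml), assuming the Hive-0.9-era HiveConf where this is an integer
property:

import org.apache.hadoop.hive.conf.HiveConf;

public class SocketTimeoutSketch {
  public static void main(String[] args) {
    HiveConf conf = new HiveConf();
    // hive.metastore.client.socket.timeout, specified in seconds
    conf.setIntVar(HiveConf.ConfVars.METASTORE_CLIENT_SOCKET_TIMEOUT, 60);
    System.out.println(
        conf.getIntVar(HiveConf.ConfVars.METASTORE_CLIENT_SOCKET_TIMEOUT));
  }
}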
>>>> If I open the CLI, run "show tables", wait the timeout period, then
>>>> run "show tables" again, the CLI hangs in:
>>>>
>>>> "main" prio=10 tid=0x000000004151a000 nid=0x448 runnable [0x0000000041b42000]
>>>>    java.lang.Thread.State: RUNNABLE
>>>>         at java.net.SocketInputStream.socketRead0(Native Method)
>>>>         at java.net.SocketInputStream.read(SocketInputStream.java:129)
>>>>         at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
>>>>         at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
>>>>         at org.apache.thrift.transport.TFramedTransport.readFrame(TFramedTransport.java:129)
>>>>         at org.apache.thrift.transport.TFramedTransport.read(TFramedTransport.java:101)
>>>>         at org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
>>>>         at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:378)
>>>>         at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:297)
>>>>         at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:204)
>>>>         at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
>>>>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:374)
>>>>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:361)
>>>>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:705)
>>>>         at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1077)
>>>>         at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1066)
>>>>         at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2004)
>>>>         at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:325)
>>>>         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:134)
>>>>         at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>>>>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1329)
>>>>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1115)
>>>>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:948)
>>>>         at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
>>>>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)