Gabor Makrai 2013-02-04, 10:44
Bennie Schut 2013-02-04, 10:53
Gabor Makrai 2013-02-04, 10:58
Looking at the versions, you might be hitting https://issues.apache.org/jira/browse/HIVE-3481, which is fixed in 0.10.
On my dev machine the test runs successfully: running time 298.952409914.
0.10 includes this patch, so it's worth looking at.
From: Gabor Makrai [mailto:[EMAIL PROTECTED]]
Sent: Monday, February 04, 2013 11:58 AM
To: [EMAIL PROTECTED]
Subject: Re: Problem with Hive JDBC server
Yes, of course! I attached the code!
On Mon, Feb 4, 2013 at 11:53 AM, Bennie Schut <[EMAIL PROTECTED]> wrote:
Since it's small can you post the code?
From: Gabor Makrai [mailto:[EMAIL PROTECTED]]
Sent: Monday, February 04, 2013 11:45 AM
To: [EMAIL PROTECTED]
Subject: Problem with Hive JDBC server
I'm writing because I ran into a very strange problem which probably affects all Hive distributions.
I wrote a small "main function only" Java program that connects to my Hive JDBC server, retrieves the list of database tables (SHOW TABLES), closes the ResultSet, the Statement, and the Connection, and repeats this 1,000 times. The problem is that the running Hive JDBC server does not release file handles, so over time it gets a "Too many open files" IOException from the JVM.
I tested with Hive 0.9, 0.8.1, and the patched Hive 0.9 shipped in CDH4.1.1.
If this is a known issue, could you tell me the solution? If it is not, I can create a new ticket in Jira and, with a little help, probably fix the problem and contribute the solution.
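For reference, the loop described above can be sketched roughly as follows. This is a minimal reproduction sketch, not the attached code: the host, port, and database name are placeholders, and it assumes the Hive 0.9-era HiveServer1 JDBC driver (`org.apache.hadoop.hive.jdbc.HiveDriver`) is on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveLeakRepro {
    public static void main(String[] args) throws Exception {
        // HiveServer1 driver class used by Hive 0.8/0.9 clients.
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");

        for (int i = 0; i < 1000; i++) {
            // Open, query, and fully close on every iteration. Despite the
            // client-side close() calls below, the server reportedly keeps
            // file handles open until it hits "Too many open files".
            Connection conn = DriverManager.getConnection(
                    "jdbc:hive://localhost:10000/default", "", "");
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery("SHOW TABLES");
            while (rs.next()) {
                // Drain the result set before closing.
            }
            rs.close();
            stmt.close();
            conn.close();
        }
    }
}
```

Running this against a HiveServer1 instance while watching the server process with `lsof -p <pid>` should show the open-file count growing, which is consistent with the leak fixed by HIVE-3481.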
王锋 2013-02-05, 05:06
王锋 2013-02-05, 05:20
Gabor Makrai 2013-02-06, 11:44
Bennie Schut 2013-02-07, 09:10
Prasad Mujumdar 2013-02-11, 20:11