Hive >> mail # user >> ClassNotFoundException when use hive java client of hive + hbase integration


Re: ClassNotFoundException when use hive java client of hive + hbase integration
This doesn't work for me. In CLI mode you can export the environment
variable to avoid running ADD JAR every time.
I did this, but I still hit the error when accessing Hive from the Java client.
And I can't even specify the --auxpath parameter when starting a Hive
Thrift service.
So, at least in my situation, I have to add the jars myself from the client.
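A minimal sketch of that client-side workaround: issue ADD JAR statements over the same JDBC session before running the query. The driver class and JDBC URL match the old HiveServer1 driver seen in the stack trace below; the jar paths, host, and port are placeholders, not values from this thread.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveClientAddJar {
    public static void main(String[] args) throws Exception {
        // Old HiveServer1 JDBC driver; URL host/port are placeholders.
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
        Connection conn =
            DriverManager.getConnection("jdbc:hive://localhost:10000/default");
        Statement stmt = conn.createStatement();

        // Register the HBase handler jars for this session. Paths are
        // placeholders; they must point at jars on the HiveServer host.
        stmt.execute("ADD JAR /path/to/hive-hbase-handler.jar");
        stmt.execute("ADD JAR /path/to/hbase.jar");
        stmt.execute("ADD JAR /path/to/zookeeper.jar");

        ResultSet rs = stmt.executeQuery(
            "SELECT count(*) FROM test_table WHERE (source = '0' "
            + "AND ur_createtime BETWEEN '20121031000000' AND '20121031235959')");
        while (rs.next()) {
            System.out.println(rs.getLong(1));
        }
        rs.close();
        stmt.close();
        conn.close();
    }
}
```

Since ADD JAR is per-session state, it has to be re-issued on every new connection; that is what makes the server-side HIVE_AUX_JARS_PATH approach attractive when it works.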

On Wed, Nov 7, 2012 at 12:31 AM, [EMAIL PROTECTED]
<[EMAIL PROTECTED]> wrote:
> FWIW, you can also drop all your needed jars (including the hbase and
> zookeeper ones) in a folder and then set this property in your hive-env.sh.
>
> export HIVE_AUX_JARS_PATH=<path to the folder>
>
> This way you need not add them manually every time.
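A hedged sketch of that hive-env.sh setup; the folder path is a placeholder. Note there must be no spaces around the `=` in shell, and depending on the Hive version the variable may expect a comma-separated list of jar files rather than a bare folder.

```shell
# In $HIVE_HOME/conf/hive-env.sh
# Folder holding the hive-hbase-handler, hbase, and zookeeper jars
# (placeholder path; adjust for your installation).
export HIVE_AUX_JARS_PATH=/opt/hive/auxlib
```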
>
>
> On Mon, Nov 5, 2012 at 9:18 PM, Cheng Su <[EMAIL PROTECTED]> wrote:
>>
>> Mark, thank you so much for your suggestion.
>>
>> Although I had already added the necessary jars to my Hive aux path,
>> so I can execute my SQL in Hive CLI mode without getting any error,
>> when I use a Java client to access the tables through the Thrift
>> service I still need to add these jars manually.
>> I executed the "ADD JAR xxxx.jar" statement and the problem is solved!
>>
>> Thank you again!
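For reference, the CLI-side equivalents discussed in this thread, with placeholder jar paths:

```shell
# Start the CLI with auxiliary jars on the classpath (comma-separated):
hive --auxpath /path/to/hive-hbase-handler.jar,/path/to/hbase.jar,/path/to/zookeeper.jar

# Or register them per-session from the hive> prompt:
#   hive> ADD JAR /path/to/hive-hbase-handler.jar;
```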
>>
>> On Tue, Nov 6, 2012 at 9:03 AM, Mark Grover <[EMAIL PROTECTED]>
>> wrote:
>> > Cheng,
>> > You will have to add the appropriate HBase-related jars to your
>> > classpath.
>> >
>> > You can do that by running "add jar" command(s) or by putting them in
>> > aux_lib. See this thread for reference:
>> >
>> > http://mail-archives.apache.org/mod_mbox/hive-user/201103.mbox/%3CAANLkTingqLGKnQmiZgoi+[EMAIL PROTECTED]%3E
>> >
>> > Mark
>> >
>> >
>> > On Mon, Nov 5, 2012 at 6:53 AM, Cheng Su <[EMAIL PROTECTED]> wrote:
>> >>
>> >> Hi, all. I have a hive+hbase integration cluster.
>> >>
>> >> When I try to execute a query through the Java client of Hive,
>> >> sometimes a ClassNotFoundException happens.
>> >>
>> >> My java code :
>> >>
>> >> final Connection conn = DriverManager.getConnection(URL);
>> >> final Statement stmt = conn.createStatement();
>> >> final ResultSet rs = stmt.executeQuery("SELECT count(*) FROM
>> >> test_table WHERE (source = '0' AND ur_createtime BETWEEN
>> >> '20121031000000' AND '20121031235959')");
>> >>
>> >> I can execute the sql: SELECT count(*) FROM test_table WHERE (source =
>> >> '0' AND ur_createtime BETWEEN '20121031000000' AND '20121031235959')
>> >> in hive cli mode, and get the query result, so there is no error in my
>> >> sql.
>> >>
>> >> The client side exception:
>> >>
>> >> Caused by: java.sql.SQLException: Query returned non-zero code: 9,
>> >> cause: FAILED: Execution Error, return code 2 from
>> >> org.apache.hadoop.hive.ql.exec.MapRedTask
>> >>     at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:189)
>> >> ... 23 more
>> >>
>> >> The server side exception(hadoop-jobtracker):
>> >>
>> >> 2012-11-05 18:55:39,443 INFO org.apache.hadoop.mapred.TaskInProgress:
>> >> Error from attempt_201210301133_0112_m_000000_3: java.io.IOException:
>> >> Cannot create an instance of InputSplit class
>> >> org.apache.hadoop.hive.hbase.HBaseSplit:org.apache.hadoop.hive.hbase.HBaseSplit
>> >>     at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:146)
>> >>     at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:67)
>> >>     at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:40)
>> >>     at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:396)
>> >>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:412)
>> >>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
>> >>     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>> >>     at java.security.AccessController.doPrivileged(Native Method)
>> >>     at javax.security.auth.Subject.doAs(Unknown Source)
>> >>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
Regards,
Cheng Su