Subject: hive query fail


Hi All,

I am using Oracle as a remote metastore for Hive.

When I run "select count(1) from pokes;" the query fails.
Also, when I try to open "http://NHCLT-PC44-2:50030/jobdetails.jsp?jobid=job_201210031252_0003" in Internet Explorer, the page cannot be displayed.
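
As a quick sanity check (just a sketch, assuming curl and netstat are available on the box; host and port are taken from the tracking URL above), the shell can be used to see whether the JobTracker web UI answers at all, which would rule out a browser-side problem:

[hadoop@NHCLT-PC44-2 ~]$ curl -I http://NHCLT-PC44-2:50030/jobtracker.jsp   # HEAD request against the JobTracker UI front page
[hadoop@NHCLT-PC44-2 ~]$ netstat -tln | grep 50030                          # check whether anything is listening on the web UI port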

[hadoop@NHCLT-PC44-2 ~]$ hive
Logging initialized using configuration in file:/home/hadoop/Hive/conf/hive-log4j.properties
Hive history file=/home/hadoop/tmp/hadoop/hive_job_log_hadoop_201210031257_2024792684.txt
hive> select count(1) from pokes;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapred.reduce.tasks=<number>
Starting Job = job_201210031252_0003, Tracking URL = http://NHCLT-PC44-2:50030/jobdetails.jsp?jobid=job_201210031252_0003
Kill Command = /home/hadoop/hadoop-1.0.3/bin/hadoop job  -kill job_201210031252_0003
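
(For reference, the reducer settings that Hive prints above can also be overridden per query from the command line; a minimal sketch, assuming the stock Hive CLI and the same pokes table:)

[hadoop@NHCLT-PC44-2 ~]$ hive -e "set mapred.reduce.tasks=1; select count(1) from pokes;"   # force a single reducer for just this query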

[hadoop@NHCLT-PC44-2 hadoop]$ cat hive.log
2012-10-03 17:20:31,965 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
2012-10-03 17:20:31,965 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
2012-10-03 17:20:31,967 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
2012-10-03 17:20:31,967 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
2012-10-03 17:20:31,967 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
2012-10-03 17:20:31,967 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
2012-10-03 17:20:36,732 WARN  mapred.JobClient (JobClient.java:copyAndConfigureFiles(667)) - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
2012-10-03 17:20:36,856 WARN  snappy.LoadSnappy (LoadSnappy.java:<clinit>(46)) - Snappy native library not loaded
My hive-site.xml is:

[hadoop@NHCLT-PC44-2 conf]$ cat hive-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->

<configuration>
<property>
  <name>hive.exec.scratchdir</name>
  <value>/home/hadoop/tmp/hive-${user.name}</value>
  <description>Scratch space for Hive jobs</description>
</property>
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/home/hadoop/user/hive/warehouse</value>
  <description>location of default database for the warehouse</description>
</property>
<property>
  <name>hive.querylog.location</name>
  <value>/home/hadoop/tmp/${user.name}</value>
  <description>Directory where structured Hive query logs are created. One file per session is created in this directory. If this variable is set to an empty string, structured logs will not be created.</description>
</property>
<!--Hive Configuration Variables used to interact with Hadoop-->
<property>
  <name>hadoop.bin.path</name>
  <value>/home/hadoop/hadoop-1.0.3/bin/hadoop</value>
  <description>The location of hadoop script which is used to submit jobs to hadoop when submitting through a separate jvm.</description>
</property>
<property>
  <name>mapred.job.tracker</name>
  <value>hdfs://localhost:8021/</value>
  <description>Datanode1</description>
</property>
<property>
  <name>hadoop.config.dir</name>
  <value>/home/hadoop/hadoop-1.0.3/conf</value>
  <description>The location of the configuration directory of the hadoop installation</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:oracle:thin:@10.99.42.11:1521:clouddb</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>oracle.jdbc.driver.OracleDriver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hiveuser</value>
</property>
</configuration>
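
For completeness, here is roughly how the mapred.job.tracker setting above could be cross-checked against what is actually running on this machine (a sketch only; jps ships with the JDK, and the ports are the 8021 from the config plus the 50030 web UI):

[hadoop@NHCLT-PC44-2 ~]$ jps | grep -i jobtracker                        # is the JobTracker daemon running at all?
[hadoop@NHCLT-PC44-2 ~]$ /home/hadoop/hadoop-1.0.3/bin/hadoop job -list  # can the client reach the JobTracker RPC port?
[hadoop@NHCLT-PC44-2 ~]$ netstat -tln | egrep '8021|50030'               # which of the expected ports are actually listening?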

Thanks and Regards
Ajit Kumar Shreevastava
ADCOE (App Development Center Of Excellence)
Mobile: 9717775634
