Hive, mail # user - hive query fail


RE: hive query fail
yogesh.kumar13@... 2012-10-03, 07:43
Hi Ajit,

I have already suggested that a plain select statement doesn't initiate a map-reduce job; it just dumps the data.
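For example (pokes is the table from your session; exact behaviour may vary by Hive version, so treat this as a rough sketch):

hive> SELECT * FROM pokes LIMIT 5;     (simple fetch, no MapReduce job is launched)
hive> SELECT COUNT(1) FROM pokes;      (compiles to and launches a MapReduce job)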
Check the network proxy settings for your node.

Try adding a proxy bypass for that address.
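For example (the host name NHCLT-PC44-2 is taken from your logs; whether an http_proxy is actually set in your shell is only my assumption), on Linux you could bypass the proxy for the local hosts before starting Hive:

export no_proxy="localhost,127.0.0.1,NHCLT-PC44-2"

or simply unset http_proxy and https_proxy for that shell session.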

Regards

Yogesh Kumar Dhari

________________________________
From: Ajit Kumar Shreevastava [[EMAIL PROTECTED]]
Sent: Wednesday, October 03, 2012 1:02 PM
To: [EMAIL PROTECTED]
Subject: hive query fail

Hi All,

I am using Oracle as a remote metastore for Hive.

Whenever I fire an insert or a plain select command on this setup, it runs successfully.
But when I run select count(1) from pokes; it fails.

[hadoop@NHCLT-PC44-2 ~]$ hive
Logging initialized using configuration in file:/home/hadoop/Hive/conf/hive-log4j.properties
Hive history file=/home/hadoop/tmp/hadoop/hive_job_log_hadoop_201210031257_2024792684.txt
hive> select count(1) from pokes;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapred.reduce.tasks=<number>
Starting Job = job_201210031252_0003, Tracking URL = http://NHCLT-PC44-2:50030/jobdetails.jsp?jobid=job_201210031252_0003
Kill Command = /home/hadoop/hadoop-1.0.3/bin/hadoop job  -kill job_201210031252_0003

[hadoop@NHCLT-PC44-2 hadoop]$ cat hive.log
2012-10-03 12:57:41,327 WARN  conf.Configuration (Configuration.java:loadResource(1245)) - file:/home/hadoop/Hive/conf/hive-site.xml:a attempt to override final parameter: mapred.job.tracker;  Ignoring.
2012-10-03 12:57:41,726 WARN  conf.Configuration (Configuration.java:loadResource(1245)) - file:/home/hadoop/Hive/conf/hive-site.xml:a attempt to override final parameter: mapred.job.tracker;  Ignoring.
2012-10-03 12:57:41,738 WARN  conf.Configuration (Configuration.java:loadResource(1245)) - file:/home/hadoop/Hive/conf/hive-site.xml:a attempt to override final parameter: mapred.job.tracker;  Ignoring.
2012-10-03 12:57:45,629 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
2012-10-03 12:57:45,629 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
2012-10-03 12:57:45,629 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
2012-10-03 12:57:45,629 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
2012-10-03 12:57:45,630 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
2012-10-03 12:57:45,630 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
2012-10-03 12:57:46,321 WARN  conf.Configuration (Configuration.java:loadResource(1245)) - file:/home/hadoop/Hive/conf/hive-site.xml:a attempt to override final parameter: mapred.job.tracker;  Ignoring.
2012-10-03 12:57:50,024 WARN  conf.Configuration (Configuration.java:loadResource(1245)) - file:/tmp/hive-default-4124574712561576117.xml:a attempt to override final parameter: fs.checkpoint.dir;  Ignoring.
2012-10-03 12:57:50,025 WARN  conf.Configuration (Configuration.java:loadResource(1245)) - file:/home/hadoop/Hive/conf/hive-site.xml:a attempt to override final parameter: mapred.job.tracker;  Ignoring.
2012-10-03 12:57:50,570 WARN  mapred.JobClient (JobClient.java:copyAndConfigureFiles(667)) - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
2012-10-03 12:57:50,748 WARN  snappy.LoadSnappy (LoadSnappy.java:<clinit>(46)) - Snappy native library not loaded

My hive-site.xml is as follows:

[hadoop@NHCLT-PC44-2 conf]$ cat hive-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->

<configuration>
<property>
  <name>hive.exec.scratchdir</name>
  <value>/home/hadoop/tmp/hive-${user.name}</value>
  <description>Scratch space for Hive jobs</description>
</property>
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/home/hadoop/user/hive/warehouse</value>
  <description>location of default database for the warehouse</description>
</property>
<property>
  <name>hive.querylog.location</name>
  <value>/home/hadoop/tmp/${user.name}</value>
  <description>Directory where structured Hive query logs are created. One file per session is created in this directory. If this variable is set to an empty string, structured logs will not be created.</description>
</property>
<!--Hive Configuration Variables used to interact with Hadoop-->
<property>
  <name>hadoop.bin.path</name>
  <value>/home/hadoop/hadoop-1.0.3/bin/hadoop</value>
  <description>The location of hadoop script which is used to