Hive >> mail # user >> Error while Creating Table in Hive


Re: Error while Creating Table in Hive
No need to worry, I am also a student. Just keep calm, start fresh,
and follow these steps -

1 - download hadoop from apache using this link -
http://apache.techartifact.com/mirror/hadoop/common/hadoop-0.20.205.0/hadoop-0.20.205.0.tar.gz

2 - untar it - right click+extract here

3 - set JAVA_HOME in your hadoop-env.sh file and save it

4 - add the properties specified in the previous replies to your
core-site.xml, hdfs-site.xml and mapred-site.xml. If it still doesn't
work, I'll send you the configured hadoop-site.xml files

5 - format HDFS

6 - start the hadoop processes
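For step 4, the "properties specified in previous replies" are not repeated in this thread, so here is a sketch of the minimal pseudo-distributed configuration conventionally used with Hadoop 0.20.x (`fs.default.name`, `dfs.replication`, `mapred.job.tracker` are the standard property names for that release; the ports shown are the customary defaults, not something confirmed in this thread). The heredocs write into a scratch directory standing in for `hadoop/conf`:

```shell
# Sketch: minimal pseudo-distributed config for Hadoop 0.20.x.
# $CONF stands in for your hadoop/conf directory.
CONF=$(mktemp -d)

cat > "$CONF/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

cat > "$CONF/hdfs-site.xml" <<'EOF'
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
EOF

cat > "$CONF/mapred-site.xml" <<'EOF'
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
EOF

ls "$CONF"   # core-site.xml  hdfs-site.xml  mapred-site.xml
```

With these in place, step 5 is `bin/hadoop namenode -format` and step 6 is `bin/start-all.sh`.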

also your hosts file should look like this -

127.0.0.1 localhost
127.0.0.1 ubuntu.ubuntu-domain ubuntu

# The following lines are desirable for IPv6 capable hosts
::1     ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
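Once the hosts file looks like the above, a quick sanity check (a sketch; `ubuntu` is the hostname used in this thread) is to confirm the names actually resolve:

```shell
# Verify the /etc/hosts entries resolve before starting Hadoop.
getent hosts localhost | grep -Eq '127\.0\.0\.1|::1' && echo "localhost resolves"
hostname --fqdn   # should print a name, not "Name or service not known"
```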

It'll work. If you face any further problems, I'll send you a
configured copy of Hadoop.

Regards,
    Mohammad Tariq
On Thu, Jun 7, 2012 at 1:45 AM, Babak Bastan <[EMAIL PROTECTED]> wrote:
> I checked it but no hadoop folder :(
> Yes, you are right. I'm a student and I want to make a very very simple
> Hive program, but until now hmmmmmmmmm
>
>
> On Wed, Jun 6, 2012 at 10:12 PM, Babak Bastan <[EMAIL PROTECTED]> wrote:
>>
>> Not just one error. For example, if I run this:
>>
>> hostname --fqdn
>>
>> with the hosts file in the state that I sent to you:
>>
>> 127.0.0.1       localhost
>> #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>> # The following lines are desirable for IPv6 capable hosts
>> #::1     ip6-localhost ip6-loopback
>> #fe00::0 ip6-localnet
>> #ff00::0 ip6-mcastprefix
>> #ff02::1 ip6-allnodes
>> #ff02::2 ip6-allrouters
>>
>> I get this error:
>>
>> hostname: Name or service not known
>>
>> Or, in the second step, with this command:
>>
>> babak@ubuntu:~/Downloads/hadoop/bin$ start-hdfs.sh
>>
>> I get these lines of errors:
>>
>>
>> mkdir: cannot create directory
>> '/home/babak/Downloads/hadoop/bin/../logs': Permission denied
>> starting namenode, logging to
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out
>> /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: line 117:
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out:
>> No such file or directory
>> head:
>> '/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out'
>> cannot be opened for reading: No such file or directory
>> localhost: mkdir: cannot create directory
>> '/home/babak/Downloads/hadoop/bin/../logs': Permission denied
>> localhost: starting datanode, logging to
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out
>> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: line 117:
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out:
>> No such file or directory
>> localhost: head:
>> '/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out'
>> cannot be opened for reading: No such file or directory
>> localhost: mkdir: cannot create directory
>> '/home/babak/Downloads/hadoop/bin/../logs': Permission denied
>> localhost: starting secondarynamenode, logging to
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out
>> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: line 117:
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out:
>> No such file or directory
>> localhost: head:
>> '/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out'
>> cannot be opened for reading: No such file or directory
>>
>> They say there is no permission to create logs in this
>> path: /home/babak/Downloads/hadoop/bin/../logs
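Those errors are all "Permission denied" on the `logs` directory next to `bin`. One way out (a sketch, not confirmed in this thread) is `sudo chown -R $USER ~/Downloads/hadoop` so the extracting user owns the tree; another is to point `HADOOP_LOG_DIR`, which `hadoop-daemon.sh` honors, at a directory the user can write:

```shell
# Redirect Hadoop daemon logs to a user-writable directory instead of
# the unwritable .../bin/../logs ($HOME/hadoop-logs is an example path).
export HADOOP_LOG_DIR="$HOME/hadoop-logs"
mkdir -p "$HADOOP_LOG_DIR"
test -w "$HADOOP_LOG_DIR" && echo "log dir writable"
```

Export this (or set it in `conf/hadoop-env.sh`) before running the start scripts.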
>>
>> And generally I can't create a table in Hive; I get this one:
>>
>> FAILED: Error in metadata: MetaException(message:Got exception:
>> java.io.FileNotFoundException File file:/user/hive/warehouse/test does not
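The `file:/user/hive/warehouse/...` scheme in that exception suggests Hive is writing to the local filesystem rather than HDFS, i.e. `fs.default.name` is not pointing at a running HDFS, which ties back to the config advice above. A local-mode workaround (a sketch; the real path is `/user/hive/warehouse` at the filesystem root, shown here under a scratch root so the commands are copy-adaptable) is to create the warehouse directory and make it writable:

```shell
# Create Hive's default warehouse directory and open it for writing.
# $ROOT is a scratch stand-in for /; drop it (and add sudo) for the real fix.
ROOT=$(mktemp -d)
mkdir -p "$ROOT/user/hive/warehouse"
chmod -R a+w "$ROOT/user/hive/warehouse"
test -d "$ROOT/user/hive/warehouse" && echo "warehouse dir ready"
```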