Hive, mail # user - Error while Creating Table in Hive


Re: Error while Creating Table in Hive
Mohammad Tariq 2012-06-06, 19:41
Go to your HADOOP_HOME, i.e. your Hadoop directory (the one that contains
bin, conf, etc.); you will find the logs directory there.

Regards,
    Mohammad Tariq
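A minimal sketch of inspecting those logs, assuming HADOOP_HOME points at the Hadoop install directory described above (the variable name and default path are assumptions; adjust to your layout):

```shell
# Assumes HADOOP_HOME is the Hadoop install directory (the one with bin/ and
# conf/); the $HOME/hadoop default is only an example.
HADOOP_HOME="${HADOOP_HOME:-$HOME/hadoop}"
LOG_DIR="$HADOOP_HOME/logs"

if [ -d "$LOG_DIR" ]; then
    ls -lt "$LOG_DIR" | head -n 5                   # most recently written logs first
    tail -n 50 "$LOG_DIR"/hadoop-*-namenode-*.log   # recent NameNode activity
else
    echo "no logs directory at $LOG_DIR (has Hadoop been started yet?)"
fi
```

The NameNode log is usually the one that explains metastore/HDFS errors surfacing in Hive.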
On Thu, Jun 7, 2012 at 1:09 AM, Babak Bastan <[EMAIL PROTECTED]> wrote:
> How can I get my logs, Mohammad?
>
>
> On Wed, Jun 6, 2012 at 9:36 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>>
>> could you post your logs? That would help me understand the
>> problem properly.
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>> On Thu, Jun 7, 2012 at 1:02 AM, Babak Bastan <[EMAIL PROTECTED]> wrote:
>> > Thank you very much, Mohammad, for your attention. I followed the steps,
>> > but the error is the same as last time.
>> > And here is my hosts file:
>> >
>> > 127.0.0.1       localhost
>> > #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>> >
>> >
>> > # The following lines are desirable for IPv6 capable hosts
>> >
>> > #::1     ip6-localhost ip6-loopback
>> > #fe00::0 ip6-localnet
>> > #ff00::0 ip6-mcastprefix
>> > #ff02::1 ip6-allnodes
>> > #ff02::2 ip6-allrouters
>> >
>> > but no effect :(
>> >
>> > On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq <[EMAIL PROTECTED]>
>> > wrote:
>> >>
>> >> also change the permissions of these directories to 777.
>> >>
>> >> Regards,
>> >>     Mohammad Tariq
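The permission change here and the directory layout from the quoted reply below can be sketched as follows (the `$HOME/hdfs` location is only an example; use whatever path you put in your configuration files):

```shell
# Example layout from the thread; substitute your own base path.
HDFS_DIR="$HOME/hdfs"
mkdir -p "$HDFS_DIR/name" "$HDFS_DIR/data" "$HDFS_DIR/temp"
chmod -R 777 "$HDFS_DIR"    # world-writable, as suggested (fine for a local test box)
ls -ld "$HDFS_DIR"/*
```

777 is the blunt-instrument fix suggested in the thread; on anything shared, tighter permissions owned by the Hadoop user are preferable.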
>> >>
>> >>
>> >> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq <[EMAIL PROTECTED]>
>> >> wrote:
>> >> > Create a directory "/home/username/hdfs" (or at a location of your
>> >> > choice). Inside this hdfs directory, create three subdirectories -
>> >> > name, data, and temp. Then follow these steps:
>> >> >
>> >> > add the following properties in your core-site.xml -
>> >> >
>> >> >        <property>
>> >> >          <name>fs.default.name</name>
>> >> >          <value>hdfs://localhost:9000/</value>
>> >> >        </property>
>> >> >
>> >> >        <property>
>> >> >          <name>hadoop.tmp.dir</name>
>> >> >          <value>/home/mohammad/hdfs/temp</value>
>> >> >        </property>
>> >> >
>> >> > then add the following properties in your hdfs-site.xml -
>> >> >
>> >> >        <property>
>> >> >                <name>dfs.replication</name>
>> >> >                <value>1</value>
>> >> >        </property>
>> >> >
>> >> >        <property>
>> >> >                <name>dfs.name.dir</name>
>> >> >                <value>/home/mohammad/hdfs/name</value>
>> >> >        </property>
>> >> >
>> >> >        <property>
>> >> >                <name>dfs.data.dir</name>
>> >> >                <value>/home/mohammad/hdfs/data</value>
>> >> >        </property>
>> >> >
>> >> > finally add this property in your mapred-site.xml -
>> >> >
>> >> >        <property>
>> >> >          <name>mapred.job.tracker</name>
>> >> >          <value>hdfs://localhost:9001</value>
>> >> >        </property>
>> >> >
>> >> > NOTE: you can give these directories any names of your choice; just
>> >> > keep in mind that the values of the above properties in your
>> >> > configuration files must match them (give the full path of each
>> >> > directory, not just its name).
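Since the NameNode fails at format or startup when a configured directory does not actually exist, a quick hedged sanity check helps; the paths below are the example values from this reply, not anything verified on the poster's machine:

```shell
# Verify that each directory named in the config files exists on disk;
# paths are the example values from the reply above.
missing=0
for d in "$HOME/hdfs/name" "$HOME/hdfs/data" "$HOME/hdfs/temp"; do
    if [ -d "$d" ]; then
        echo "ok:      $d"
    else
        echo "missing: $d"
        missing=$((missing + 1))
    fi
done
echo "$missing of 3 directories missing"
```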
>> >> >
>> >> > After this, follow the steps provided in the previous reply.
>> >> >
>> >> > Regards,
>> >> >     Mohammad Tariq
>> >> >
>> >> >
>> >> > On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan <[EMAIL PROTECTED]>
>> >> > wrote:
>> >> >> thanks, Mohammad
>> >> >>
>> >> >> with this command:
>> >> >>
>> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode -format
>> >> >>
>> >> >> this is my output:
>> >> >>
>> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
>> >> >> /************************************************************
>> >> >> STARTUP_MSG: Starting NameNode
>> >> >> STARTUP_MSG:   host = ubuntu/127.0.1.1
>> >> >> STARTUP_MSG:   args = [-format]
>> >> >> STARTUP_MSG:   version = 0.20.2
>> >> >> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20
>> >> >> -r 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
>> >> >> ************************************************************/
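If the format completes cleanly, the 0.20-era daemons are typically brought up with the start scripts in bin; a sketch, assuming the `~/Downloads/hadoop/bin` path visible in the session above:

```shell
HADOOP_BIN="$HOME/Downloads/hadoop/bin"   # path taken from the session above; adjust as needed
if [ -x "$HADOOP_BIN/start-all.sh" ]; then
    "$HADOOP_BIN/start-all.sh"
    # jps (from the JDK) should then list NameNode, DataNode,
    # SecondaryNameNode, JobTracker and TaskTracker
    command -v jps >/dev/null && jps || true
    status="started"
else
    status="no hadoop install at $HADOOP_BIN"
fi
echo "$status"
```

Only once all five daemons show up in jps is it worth retrying the Hive CREATE TABLE.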