Re: Error while Creating Table in Hive
Once we are done with the configuration, we need to format the file
system. Use this command to do that:
bin/hadoop namenode -format

After this, the Hadoop daemon processes should be started using the following commands:
bin/start-dfs.sh (it'll start the NameNode and DataNode)
bin/start-mapred.sh (it'll start the JobTracker and TaskTracker)

After this, use jps to check that everything is all right, or point your
browser to http://localhost:50070. If you run into any further problems,
send us the error logs. :)
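
For reference, on a typical single-node (pseudo-distributed) setup the whole
sequence looks roughly like this; the PIDs below are only illustrative and
the exact daemon set can differ slightly by version:

bin/hadoop namenode -format   (only once; this wipes HDFS metadata)
bin/start-dfs.sh
bin/start-mapred.sh
jps

4501 NameNode
4678 DataNode
4859 SecondaryNameNode
5021 JobTracker
5203 TaskTracker
5388 Jps

If jps shows only "Jps" (as in your case), none of the daemons came up, so
check the files under Hadoop's logs/ directory for the reason.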

Regards,
    Mohammad Tariq
On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan <[EMAIL PROTECTED]> wrote:
> Were you able to format HDFS properly?
> I didn't get your question. Do you mean HADOOP_HOME, or where did I
> install Hadoop?
>
> On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>>
>> If you are getting only this, it means your Hadoop is not
>> running. Were you able to format HDFS properly?
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan <[EMAIL PROTECTED]> wrote:
>> > Hi Mohammad, if I run jps in my shell I can see this result:
>> > 2213 Jps
>> >
>> >
>> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq <[EMAIL PROTECTED]>
>> > wrote:
>> >>
>> >> You can also use the "jps" command in your shell to see whether the
>> >> Hadoop processes are running or not.
>> >>
>> >> Regards,
>> >>     Mohammad Tariq
>> >>
>> >>
>> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq <[EMAIL PROTECTED]>
>> >> wrote:
>> >> > Hi Babak,
>> >> >
>> >> >  You have to type it in your web browser. Hadoop provides a web GUI
>> >> > that not only allows us to browse through the file system, but to
>> >> > download the files as well. Apart from that, it also provides a web
>> >> > GUI that can be used to see the status of the JobTracker and
>> >> > TaskTracker. When you run a Hive, Pig, or MapReduce job, you can point
>> >> > your browser to http://localhost:50030 to see the status and logs of
>> >> > your job.
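>> >> >
>> >> > A quick way to check both UIs from the shell, assuming curl is
>> >> > installed and you are on the default ports, is something like:
>> >> >
>> >> > curl -s -o /dev/null -w "%{http_code}\n" http://localhost:50070/
>> >> > curl -s -o /dev/null -w "%{http_code}\n" http://localhost:50030/
>> >> >
>> >> > Both should print 200 if the NameNode and JobTracker web UIs are up;
>> >> > 000 means curl could not connect, i.e. that daemon is not running.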
>> >> >
>> >> > Regards,
>> >> >     Mohammad Tariq
>> >> >
>> >> >
>> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan <[EMAIL PROTECTED]>
>> >> > wrote:
>> >> >> Thank you shashwat for the answer.
>> >> >> Where should I type http://localhost:50070?
>> >> >> I typed it here: hive> http://localhost:50070 but got nothing as a result.
>> >> >>
>> >> >>
>> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
>> >> >> <[EMAIL PROTECTED]> wrote:
>> >> >>>
>> >> >>> First open http://localhost:50070 and check whether it loads and how
>> >> >>> many nodes are available. Then try some of the Hadoop shell commands
>> >> >>> from http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html and
>> >> >>> run an example MapReduce task on Hadoop; take an example from here:
>> >> >>> http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>> >> >>>
>> >> >>> If you can do all of the above successfully, it means Hadoop is
>> >> >>> configured correctly.
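>> >> >>>
>> >> >>> For instance (the exact name of the examples jar depends on your
>> >> >>> Hadoop version, so adjust the path as needed):
>> >> >>>
>> >> >>> bin/hadoop fs -ls /
>> >> >>> bin/hadoop fs -mkdir /tmp/smoketest
>> >> >>> bin/hadoop jar hadoop-examples-*.jar pi 2 10
>> >> >>>
>> >> >>> The pi job should finish and print an estimated value of Pi; if the
>> >> >>> shell commands or the job fail, the error messages will tell you
>> >> >>> which part of the setup is broken.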
>> >> >>>
>> >> >>> Regards
>> >> >>> Shashwat
>> >> >>>
>> >> >>>
>> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan <[EMAIL PROTECTED]>
>> >> >>> wrote:
>> >> >>>>
>> >> >>>> No, I'm not working on CDH. Is there a way to test whether my Hadoop
>> >> >>>> works fine or not?
>> >> >>>>
>> >> >>>>
>> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS <[EMAIL PROTECTED]>
>> >> >>>> wrote:
>> >> >>>>>
>> >> >>>>> Hi Babak
>> >> >>>>>
>> >> >>>>> You have to follow the instructions on the Apache site to set up Hadoop
>> >> >>>>> from scratch and ensure that HDFS is working first. You should be able
>> >> >>>>> to read and write files to HDFS before you do your next steps.
>> >> >>>>>
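>> >> >>>>> A simple read/write check (assuming a default setup) would be
>> >> >>>>> something like:
>> >> >>>>>
>> >> >>>>> bin/hadoop fs -put /etc/hosts /tmp/hosts-test
>> >> >>>>> bin/hadoop fs -cat /tmp/hosts-test
>> >> >>>>> bin/hadoop fs -rm /tmp/hosts-test
>> >> >>>>>
>> >> >>>>> If the -cat prints the file back, HDFS is accepting writes and reads.
>> >> >>>>>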
>> >> >>>>> Are you on CDH or the Apache distribution of Hadoop? If it is CDH,
>> >> >>>>> there are detailed instructions on the Cloudera web site.
>> >> >>>>>
>> >> >>>>> Regards
>> >> >>>>> Bejoy KS
>> >> >>>>>