Hadoop Configuration Issues


Björn-Elmar Macek 2012-04-27, 10:01
alo alt 2012-04-27, 10:12
Björn-Elmar Macek 2012-04-27, 10:39
alo alt 2012-04-27, 11:11
Björn-Elmar Macek 2012-04-27, 14:28
RE: Hadoop Configuration Issues
I would suggest changing the Hadoop configuration on the slave(s).
You'll have to maintain the delta config amongst the nodes.
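(A rough sketch of that approach, not from the thread itself: the start scripts ssh into each slave and run bin/hadoop-daemon.sh under the same path that $HADOOP_HOME has on the master, so the path in the errors below has to exist on the slave as well - either by installing Hadoop there or by pointing a symlink at the real install location.)

# On the slave (paths are only an example; adjust to your setup):
mkdir -p /home/ema/Programs
# Assuming Hadoop is actually unpacked under /opt/hadoop-1.0.2 on that machine:
ln -s /opt/hadoop-1.0.2 /home/ema/Programs/hadoop-1.0.2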
-----Original Message-----
From: Björn-Elmar Macek [mailto:[EMAIL PROTECTED]]
Sent: Friday, April 27, 2012 7:28 AM
To: [EMAIL PROTECTED]
Subject: Re: Hadoop Configuration Issues

Hi Alex,

thank you for the tip: it pushed me in the right direction. I was using the deb package to install Hadoop, which did not work out because of the problems I had. Now I use the tarball archive and unpacked it to a subfolder of my home directory.
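(A minimal sketch of such a tarball install, assuming hadoop-1.0.2.tar.gz and ~/Programs as the target directory:)

# Unpack the release tarball into a user-writable directory:
mkdir -p ~/Programs
tar -xzf hadoop-1.0.2.tar.gz -C ~/Programs
# Point the shell at it; HADOOP_HOME still works in 1.0.x but prints the
# deprecation warning seen in the output below:
export HADOOP_HOME=~/Programs/hadoop-1.0.2
export PATH=$PATH:$HADOOP_HOME/bin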

BUT I just ran into another problem: after successfully executing "hadoop namenode -format", the "start-all.sh" script (executed on the master server) runs into errors when trying to access files on the slave system: it seems to expect the Hadoop files to lie in the very same directory as on the master:

ema@ubuntu:~/Programs/hadoop-1.0.2/bin$ ./start-all.sh
Warning: $HADOOP_HOME is deprecated.

namenode running as process 19393. Stop it first.
slave: bash: line 0: cd: /home/ema/Programs/hadoop-1.0.2/libexec/..: No such file or directory
slave: bash: /home/ema/Programs/hadoop-1.0.2/bin/hadoop-daemon.sh: No such file or directory
ema@master's password:
master: Connection closed by UNKNOWN
starting jobtracker, logging to /var/log/hadoop/ema/hadoop-ema-jobtracker-ubuntu.out
slave: bash: line 0: cd: /home/ema/Programs/hadoop-1.0.2/libexec/..: No such file or directory
slave: bash: /home/ema/Programs/hadoop-1.0.2/bin/hadoop-daemon.sh: No such file or directory

How can I tell the slave that the files lie somewhere else?

Best regards,
Björn-Elmar

On 27.04.2012 13:11, alo alt wrote:
> Hi,
>
> yes, sorry - I saw that after I hit the send button.
> Looks like you mixed up some configs with the wrong templates. I would suggest you use the default configs:
> http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/conf/
>
> - Alex
>
> --
> Alexander Lorenz
> http://mapredit.blogspot.com
>
> On Apr 27, 2012, at 12:39 PM, Björn-Elmar Macek wrote:
>
>> Hi Alex,
>>
>> as I have written, I already did so! The problem, as already stated in my previous mail, is that all the variables ${bla} seem to be UNSET - not only SECURITY_TYPE. As I don't really understand those parameters, I would like to use the default ones, which AFAIK should be configured in hadoop-env.sh. But obviously they are not.
>>
>> Best,
>> Björn
>>
>> On 27.04.2012 12:12, alo alt wrote:
>>> Hi,
>>>
>>> Invalid attribute value for hadoop.security.authentication of ${SECURITY_TYPE}
>>> Set it to simple and it should work (default is kerberos).
>>>
>>> - Alex
>>>
>>> --
>>> Alexander Lorenz
>>> http://mapredit.blogspot.com
>>>
>>> On Apr 27, 2012, at 12:01 PM, Björn-Elmar Macek wrote:
>>>
>>>> Hello,
>>>>
>>>> I have recently installed Hadoop on my machine and on a second machine in order to test the setup and develop little programs locally before deploying them to the cluster. I stumbled over several difficulties, which I could fix with some internet research. But once again I'm stuck, and I think I can nail the problem down:
>>>>
>>>> When Hadoop evaluates the config files in /etc/hadoop, it does not have default values for the variables used within:
>>>>
>>>> \________ First Error:
>>>> hadoop namenode -format
>>>> Warning: $HADOOP_HOME is deprecated.
>>>>
>>>> 12/04/27 11:31:41 INFO namenode.NameNode: STARTUP_MSG:
>>>> /************************************************************
>>>> STARTUP_MSG: Starting NameNode
>>>> STARTUP_MSG:   host = ubuntu/127.0.1.1
>>>> STARTUP_MSG:   args = [-format]
>>>> STARTUP_MSG:   version = 1.0.1
>>>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r 1243785; compiled by 'hortonfo' on Tue Feb 14 08:13:52 UTC 2012
>>>> ************************************************************/
>>>> 12/04/27 11:31:41 INFO util.GSet: VM type       = 32-bit
>>>> 12/04/27 11:31:41 INFO util.GSet: 2% max memory = 2.475 MB
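(For anyone hitting the same errors: the ${SECURITY_TYPE} value Alex mentions above comes from a config file that still contains an unexpanded template placeholder. A rough sketch of one way to track it down and apply his fix, assuming the property ended up in conf/core-site.xml:)

# See which config files still contain the unexpanded placeholder:
grep -rn 'SECURITY_TYPE' ~/Programs/hadoop-1.0.2/conf/
# Then replace it with a literal value, e.g. in core-site.xml:
#   <property>
#     <name>hadoop.security.authentication</name>
#     <value>simple</value>
#   </property>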
alo alt 2012-04-27, 19:38