Re: Kerberos principal doesn't replace _HTTP with its own host name
Daryn Sharp 2013-04-01, 14:20
Hi,

While it would be nice if this setting followed the convention of allowing _HOST, the auth handler is independent of the common project, which implements that behavior.  The good news is that I believe the config key is only used by the NN, so you shouldn't have to reconfigure all your nodes.  Have you tested whether a client fails when the config has a specific hostname?
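
For example (just a sketch; the host and realm below are placeholders for your NameNode's FQDN and your Kerberos realm), something like this in the NameNode's core-site.xml alone should be enough:

<property>
  <name>hadoop.http.authentication.kerberos.principal</name>
  <!-- placeholder values: substitute the NameNode's actual FQDN and your realm -->
  <value>HTTP/nn01.hadoop.com@HADOOP.COM</value>
</property>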

Daryn

On Apr 1, 2013, at 2:50 AM, Oh Seok Keun wrote:

Hi All

I upgraded my Hadoop cluster to v1.1.2 last week and configured Hadoop security with Kerberos.
When I added the configuration for Hadoop HTTP authentication, the NameNode web server failed to start.
When I set 'hadoop.http.authentication.kerberos.principal' to a principal with the proper host name (e.g. HTTP/[EMAIL PROTECTED]), the NameNode works fine. But I can't configure every node (hundreds of machines) with its own host name.
I guess the new SPNEGO feature can't replace _HTTP with the host's own name. Is that right?
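
What I would like to use is a placeholder form like the one other Hadoop security keys (e.g. dfs.namenode.kerberos.principal) support, sketched below with _HOST, the placeholder used elsewhere, and a placeholder realm; it does not seem to be expanded for this key:

<property>
  <name>hadoop.http.authentication.kerberos.principal</name>
  <!-- sketch only: _HOST does not appear to be expanded for this key; realm is a placeholder -->
  <value>HTTP/_HOST@HADOOP.COM</value>
</property>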

My actual configuration and the log follow:

# configuration

<property>
  <name>hadoop.http.filter.initializers</name>
  <value>org.apache.hadoop.security.AuthenticationFilterInitializer</value>
</property>
<property>
  <name>hadoop.http.authentication.type</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.http.authentication.token.validity</name>
  <value>36000</value>
</property>
<property>
  <name>hadoop.http.authentication.signature.secret.file</name>
  <value>/hadoop/security/conf/hadoop-http-auth-signature-secret</value>
</property>
<property>
  <name>hadoop.http.authentication.cookie.domain</name>
  <value>hadoop.com</value>
</property>
<property>
  <name>hadoop.http.authentication.simple.anonymous.allowed</name>
  <value>false</value>
</property>
<property>
  <name>hadoop.http.authentication.kerberos.principal</name>
  <value>HTTP/[EMAIL PROTECTED]</value>
</property>
<property>
  <name>hadoop.http.authentication.kerberos.keytab</name>
  <value>/hadoop/security/keytab/hdfs.keytab</value>
</property>
# log

2013-04-01 16:37:25,720 INFO org.apache.hadoop.http.HttpServer: dfs.webhdfs.enabled = false
2013-04-01 16:37:25,721 INFO org.apache.hadoop.http.HttpServer: Adding Kerberos (SPNEGO) filter to getDelegationToken
2013-04-01 16:37:25,722 INFO org.apache.hadoop.http.HttpServer: Adding Kerberos (SPNEGO) filter to renewDelegationToken
2013-04-01 16:37:25,723 INFO org.apache.hadoop.http.HttpServer: Adding Kerberos (SPNEGO) filter to cancelDelegationToken
2013-04-01 16:37:25,723 INFO org.apache.hadoop.http.HttpServer: Adding Kerberos (SPNEGO) filter to fsck
2013-04-01 16:37:25,724 INFO org.apache.hadoop.http.HttpServer: Adding Kerberos (SPNEGO) filter to getimage
2013-04-01 16:37:25,728 INFO org.apache.hadoop.http.HttpServer: Port returned by webServer.getConnectors()[0].getLocalPort() before open() is -1. Opening the listener on 50070
2013-04-01 16:37:25,730 INFO org.apache.hadoop.http.HttpServer: listener.getLocalPort() returned 50070 webServer.getConnectors()[0].getLocalPort() returned 50070
2013-04-01 16:37:25,730 INFO org.apache.hadoop.http.HttpServer: Jetty bound to port 50070
2013-04-01 16:37:25,730 INFO org.mortbay.log: jetty-6.1.26
2013-04-01 16:37:26,091 INFO org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler: Login using keytab /hadoop/security/keytab/hdfs.keytab, for principal HTTP/[EMAIL PROTECTED]
2013-04-01 16:37:26,113 WARN org.mortbay.log: failed authentication: javax.servlet.ServletException: javax.security.auth.login.LoginException: Unable to obtain password from user

2013-04-01 16:37:26,114 WARN org.mortbay.log: Failed startup of context org.mortbay.jetty.webapp.WebAppContext@3e7bfc04{/,file:/hadoop/webapps/hdfs}
javax.servlet.ServletException: javax.security.auth.login.LoginException: Unable to obtain password from user

        at org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler.init(KerberosAuthenticationHandler.java:178)
        at org.apache.hadoop.security.authentication.server.AuthenticationFilter.init(AuthenticationFilter.java:146)
        at org.mortbay.jetty.servlet.FilterHolder.doStart(FilterHolder.java:97)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
        at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:713)
        at org.mortbay.jetty.servlet.Context.startContext(Context.java:140)
        at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1282)
        at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:518)
        at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:499)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
        at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152)
        at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:156)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
        at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130)
        at org.mortbay.jetty.Server.doStart(Server.java:224)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
        at org.apache.hadoop.http.HttpServer.start(HttpServer.java:631)
        at org.apache.hadoop.hdfs.server.namenode.NameNode$1.run(NameNode.java:484)
        at org.apache.hadoop.hdfs.server.namenode.NameNode$1.run(NameNode.java:362)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:362)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:313)
        at org.apa