Re: "Server not found in Kerberos database" for MiniKDC server
For the sake of completeness:

The same settings worked on a Linux box.
On Wed, Jan 15, 2014 at 10:57 PM, Elazar Leibovich <[EMAIL PROTECTED]> wrote:

> Hi,
>
> For educational purposes, I'm trying to set up a minimal working secure
> Hadoop cluster on my machine.
>
> What I basically did is:
>
> Add example.com to /etc/hosts
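>
> (As a sanity check that the hosts entry is actually picked up, I run a
> snippet along these lines to see what the JVM resolves example.com to,
> forward and reverse; the class name is just something I made up:)
>
> import java.net.InetAddress;
>
> public class ResolveCheck {
>     public static void main(String[] args) throws Exception {
>         // forward lookup: should come from the /etc/hosts entry
>         InetAddress addr = InetAddress.getByName("example.com");
>         System.out.println(addr.getHostAddress());
>         // reverse/canonical lookup also matters for Kerberos service principals
>         System.out.println(addr.getCanonicalHostName());
>     }
> }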
>
> Set up a MiniKDC server. It generates krb5.conf and a keytab, and creates
> some users: {nn,dn,hdfs}@EXAMPLE.COM
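>
> (Roughly, the MiniKDC part looks like the sketch below; the work directory
> and the exact principal names are from my setup and approximate, so take it
> as an outline rather than the exact code:)
>
> import java.io.File;
> import java.util.Properties;
> import org.apache.hadoop.minikdc.MiniKdc;
>
> public class StartKdc {
>     public static void main(String[] args) throws Exception {
>         File workDir = new File("/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough");
>         Properties kdcConf = MiniKdc.createConf();   // default realm is EXAMPLE.COM
>         MiniKdc kdc = new MiniKdc(kdcConf, workDir);
>         kdc.start();                                 // generates krb5.conf (see getKrb5conf())
>         // one keytab holding all the principals referenced in the config below
>         File keytab = new File(workDir, "keytab");
>         kdc.createPrincipal(keytab, "nn/example.com", "dn/example.com", "hdfs");
>         System.out.println("realm: " + kdc.getRealm()
>                 + ", krb5.conf: " + kdc.getKrb5conf());
>     }
> }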
>
> Point Java at krb5.conf via HADOOP_OPTS, along with a workaround required
> for the Mac OS X JVM:
>
> ❯ ~/hadoopconf env HADOOP_OPTS
> hadoop-env.sh HADOOP_OPTS = -Djava.awt.headless=true
> -Djava.security.krb5.conf=/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf
> -Djava.net.preferIPv4Stack=true
>
>
> Set the proper Hadoop configuration using the keytab and the Hadoop users:
>
> ❯ ~/hadoopconf get --local
> hdfs-site.xml dfs.datanode.address            = example.com:1004
> core-site.xml fs.defaultFS                    = hdfs://example.com
> hdfs-site.xml dfs.namenode.keytab.file        > /Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/keytab
> hdfs-site.xml dfs.datanode.hostname           = example.com
> hdfs-site.xml dfs.datanode.kerberos.principal = dn/[EMAIL PROTECTED]
> hdfs-site.xml dfs.datanode.data.dir           > /tmp/hadoop-eleibovi/dfs/data
> hdfs-site.xml dfs.datanode.keytab.file        > /Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/keytab
> hdfs-site.xml dfs.namenode.kerberos.principal = nn/[EMAIL PROTECTED]
> core-site.xml hadoop.security.authorization   = true
> core-site.xml hadoop.security.authentication  = kerberos
> hdfs-site.xml dfs.datanode.dns.interface      = lo0
> hdfs-site.xml dfs.datanode.http.address       = example.com:1006
>
> Start the namenode service.
> $ ./bin/hdfs
> ...
> 14/01/15 19:22:43 INFO ipc.Server: IPC Server listener on 8020: starting
> 14/01/15 19:22:43 INFO namenode.NameNode: NameNode RPC up at: localhost/
> 127.0.0.1:8020
> 14/01/15 19:22:43 INFO namenode.FSNamesystem: Starting services required
> for active state
>
> Finally, use the following short Java program to contact the namenode:
>
> // conf holds core-site.xml and hdfs-site.xml (passed via -conf, see the
> // args in the log below); cwd is the directory with krb5.conf and the keytab.
> System.setProperty("java.security.krb5.conf", cwd + "/krb5.conf");
> UserGroupInformation.setConfiguration(conf);
> UserGroupInformation ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
>         "hdfs/EXAMPLE.COM", cwd + "/keytab");
> ugi.doAs(new PrivilegedExceptionAction<Object>() {
>     @Override
>     public Object run() throws Exception {
>         final FileSystem fs = FileSystem.get(conf);
>         fs.getFileStatus(new Path("/"));
>         return null;
>     }
> });
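>
> (For completeness, conf above is just the two site files from the -conf
> arguments in the log below; loading them explicitly would look roughly
> like this:)
>
> Configuration conf = new Configuration();
> conf.addResource(new Path(
>         "/Users/eleibovi/dev/securehadoop/hadoop-2.1.0-beta/etc/hadoop/core-site.xml"));
> conf.addResource(new Path(
>         "/Users/eleibovi/dev/securehadoop/hadoop-2.1.0-beta/etc/hadoop/hdfs-site.xml"));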
>
> The exception I got is:
>
> Exception in thread "main" java.io.IOException: Failed on local exception:
> java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed
> [Caused by GSSException: No valid credentials provided (Mechanism level:
> Server not found in Kerberos database (7) - Server not found in Kerberos
> database)]; Host Details : local host is: "tlv-mpbxb/127.0.0.1";
> destination host is: "example.com":8020;
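>
> In case it's relevant: the NameNode log above says it came up at
> localhost/127.0.0.1:8020, so I wonder whether the client ends up asking the
> KDC for a principal for a different host than nn/example.com. Below is a
> rough sketch of how one might check what hostname/principal the client-side
> helpers come up with (I haven't verified this is the right way to check;
> the _HOST pattern isn't in my config, it's only there to see the substitution):
>
> import java.net.InetAddress;
> import org.apache.hadoop.security.SecurityUtil;
>
> public class PrintServerPrincipal {
>     public static void main(String[] args) throws Exception {
>         // the canonical name the JVM resolves for the NameNode address
>         String canonical =
>                 InetAddress.getByName("example.com").getCanonicalHostName();
>         System.out.println(canonical);
>         // how a _HOST-style principal would be filled in for that hostname
>         System.out.println(SecurityUtil.getServerPrincipal(
>                 "nn/_HOST@EXAMPLE.COM", canonical));
>     }
> }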
>
> I'd be glad for any help with debugging the problem.
>
> Thanks,
>
> I'm attaching a full log with Kerberos debugging turned on:
>
> args: [-conf,
> /Users/eleibovi/dev/securehadoop/hadoop-2.1.0-beta/etc/hadoop/core-site.xml,
> -conf,
> /Users/eleibovi/dev/securehadoop/hadoop-2.1.0-beta/etc/hadoop/hdfs-site.xml]
> 2014-01-15 19:29:46 DEBUG MutableMetricsFactory:42 - field
> org.apache.hadoop.metrics2.lib.MutableRate
> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess
> with annotation
> @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=,
> value=[Rate of successful kerberos logins and latency (milliseconds)],
> always=false, type=DEFAULT, sampleName=Ops)
> 2014-01-15 19:29:46 DEBUG MutableMetricsFactory:42 - field
> org.apache.hadoop.metrics2.lib.MutableRate
> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure
> with annotation