"Server not found in Kerberos database" for MiniKDC server
Hi,

For educational purposes, I'm trying to set up a minimal working secure Hadoop
cluster on my machine.

What I basically did is:

Add example.com to /etc/hosts
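(Concretely, this is roughly a line like "127.0.0.1   example.com", so the hostname resolves to the loopback interface; the exact entry is illustrative.)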

Set up a MiniKdc server. It generates krb5.conf and a keytab, and creates a few
principals: {nn,dn,hdfs}@EXAMPLE.COM
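For reference, the MiniKdc setup looks roughly like this (a sketch only; MiniKdc is
org.apache.hadoop.minikdc.MiniKdc from the hadoop-minikdc test artifact, and the
work directory and exact principal names here are illustrative):

// Rough sketch of the MiniKdc setup; MiniKdc's default realm is EXAMPLE.COM.
File workDir = new File("/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough");
Properties kdcConf = MiniKdc.createConf();
MiniKdc kdc = new MiniKdc(kdcConf, workDir);
kdc.start();                                  // writes krb5.conf under workDir

// One keytab holding keys for the generated principals.
File keytab = new File(workDir, "keytab");
kdc.createPrincipal(keytab, "nn", "dn", "hdfs");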

Point Java at krb5.conf via HADOOP_OPTS, together with a workaround required for
the Mac OS X JVM:

❯ ~/hadoopconf env HADOOP_OPTS
hadoop-env.sh HADOOP_OPTS = -Djava.awt.headless=true
-Djava.security.krb5.conf=/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf
-Djava.net.preferIPv4Stack=true
Set the proper Hadoop configuration, using the keytab and the Hadoop principals:

❯ ~/hadoopconf get --local
hdfs-site.xml dfs.datanode.address            = example.com:1004
core-site.xml fs.defaultFS                    = hdfs://example.com
hdfs-site.xml dfs.namenode.keytab.file        = /Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/keytab
hdfs-site.xml dfs.datanode.hostname           = example.com
hdfs-site.xml dfs.datanode.kerberos.principal = dn/[EMAIL PROTECTED]
hdfs-site.xml dfs.datanode.data.dir           = /tmp/hadoop-eleibovi/dfs/data
hdfs-site.xml dfs.datanode.keytab.file        = /Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/keytab
hdfs-site.xml dfs.namenode.kerberos.principal = nn/[EMAIL PROTECTED]
core-site.xml hadoop.security.authorization   = true
core-site.xml hadoop.security.authentication  = kerberos
hdfs-site.xml dfs.datanode.dns.interface      = lo0
hdfs-site.xml dfs.datanode.http.address       = example.com:1006

Start the namenode service.
$ ./bin/hdfs
...
14/01/15 19:22:43 INFO ipc.Server: IPC Server listener on 8020: starting
14/01/15 19:22:43 INFO namenode.NameNode: NameNode RPC up at: localhost/127.0.0.1:8020
14/01/15 19:22:43 INFO namenode.FSNamesystem: Starting services required for active state

Finally, I use the following short Java program to contact the namenode:

System.setProperty("java.security.krb5.conf", cwd + "/krb5.conf");
UserGroupInformation.setConfiguration(conf);
UserGroupInformation ugi = UserGroupInformation
        .loginUserFromKeytabAndReturnUGI("hdfs/EXAMPLE.COM", cwd + "/keytab");
ugi.doAs(new PrivilegedExceptionAction<Object>() {
    @Override
    public Object run() throws Exception {
        final FileSystem fs = FileSystem.get(conf);
        fs.getFileStatus(new Path("/"));
        return null;
    }
});
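(conf here carries the same core-site.xml and hdfs-site.xml shown above, passed via
-conf as the args line in the attached log shows; building it by hand would look
roughly like the sketch below, with the paths taken from that log.)

// Sketch: loading the client configuration explicitly.
Configuration conf = new Configuration();
conf.addResource(new Path("/Users/eleibovi/dev/securehadoop/hadoop-2.1.0-beta/etc/hadoop/core-site.xml"));
conf.addResource(new Path("/Users/eleibovi/dev/securehadoop/hadoop-2.1.0-beta/etc/hadoop/hdfs-site.xml"));
// After UserGroupInformation.setConfiguration(conf), isSecurityEnabled() should report true.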

The exception I got is:

Exception in thread "main" java.io.IOException: Failed on local exception:
java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed
[Caused by GSSException: No valid credentials provided (Mechanism level:
Server not found in Kerberos database (7) - Server not found in Kerberos
database)]; Host Details : local host is: "tlv-mpbxb/127.0.0.1";
destination host is: "example.com":8020;

I'd be glad for any help with debugging the problem.

Thanks,

I'm attaching a full log with Kerberos debugging turned on:

args: [-conf,
/Users/eleibovi/dev/securehadoop/hadoop-2.1.0-beta/etc/hadoop/core-site.xml,
-conf,
/Users/eleibovi/dev/securehadoop/hadoop-2.1.0-beta/etc/hadoop/hdfs-site.xml]
2014-01-15 19:29:46 DEBUG MutableMetricsFactory:42 - field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess
with annotation
@org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=,
value=[Rate of successful kerberos logins and latency (milliseconds)],
always=false, type=DEFAULT, sampleName=Ops)
2014-01-15 19:29:46 DEBUG MutableMetricsFactory:42 - field
org.apache.hadoop.metrics2.lib.MutableRate
org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure
with annotation
@org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=,
value=[Rate of failed kerberos logins and latency (milliseconds)],
always=false, type=DEFAULT, sampleName=Ops)
2014-01-15 19:29:46 DEBUG MetricsSystemImpl:220 - UgiMetrics, User and
group related metrics
Java config name:
/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf
Loaded from Java config
2014-01-15 19:29:46 DEBUG Groups:180 -  Creating new Groups object
2014-01-15 19:29:46 DEBUG NativeCodeLoader:46 - Trying to load the
custom-built native-hadoop library...
2014-01-15 19:29:46 DEBUG NativeCodeLoader:55 - Failed to load
native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in
java.library.path
2014-01-15 19:29:46 DEBUG NativeCodeLoader:56 -
java.library.path=/Users/eleibovi/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
2014-01-15 19:29:46 WARN  NativeCodeLoader:62 - Unable to load
native-hadoop library for your platform... using builtin-java classes where
applicable
2014-01-15 19:29:46 DEBUG JniBasedUnixGroupsMappingWithFallback:40 -
Falling back to shell based
2014-01-15 19:29:46 DEBUG JniBasedUnixGroupsMappingWithFallback:44 - Group
mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
2014-01-15 19:29:46 DEBUG Groups:66 - Group mapping
impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback;
cacheTimeout=300000
Java config name:
/Users/eleibovi/dev/securehadoop/hadoop_rpc_walktrhough/krb5.conf
Loaded from Java config
Added key: 23version: 0
Added key: 17version: 0
Added key: 16version: 0
Added key: 3version: 0
Ordering keys wrt default_tkt_enctypes list
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
Added key: 23version: 0
Added key: 17version: 0
Added key: 16version: 0
Added key: 3version: 0
Ordering keys wrt default_tkt_enctypes list
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 18 17 16 23 1 3.
retries =3, #bytes=158
#bytes=158
Added key: 23version: 0
Added key: 17version: 0
Added key: 16version: 0
Added key: 3version: 0
Or
Elazar Leibovich 2014-01-20, 11:34