Amith D K 2012-06-30, 06:29
You need to obtain a Kerberos ticket before running any HDFS command, e.g.:

kinit <userid>@<domain>

Sent from my iPhone
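
A minimal sketch of that ticket workflow on the client; the principal, realm, and keytab path below are placeholders, not values from this thread:

    # Obtain a TGT interactively (prompts for the user's Kerberos password);
    # replace the placeholder principal with your own user@REALM.
    kinit amith@EXAMPLE.COM

    # Or non-interactively from a keytab (placeholder path):
    kinit -kt /etc/security/keytabs/amith.keytab amith@EXAMPLE.COM

    # Confirm the credential cache now holds a valid TGT:
    klist

    # Then re-run the failing command:
    ./bin/hdfs fsck /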

On Jun 29, 2012, at 11:29 PM, Amith D K <[EMAIL PROTECTED]> wrote:

> I started Hadoop in secure mode.
>
> When I issue fsck I get the error below.
>
> Is there any configuration needed on the client side?
>
> Thanks and Regards
>
> Amith
>
> host-xx.xx.xx.xx:/home/amith/secure/hadoop-2.0.1 # ./bin/hdfs fsck /
> 12/06/29 17:46:10 ERROR security.UserGroupInformation: PriviledgedActionException as:root (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> 12/06/29 17:46:10 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> 12/06/29 17:46:10 ERROR security.UserGroupInformation: PriviledgedActionException as:root (auth:KERBEROS) cause:java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> 12/06/29 17:46:10 ERROR security.UserGroupInformation: PriviledgedActionException as:root (auth:KERBEROS) cause:java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "host-xx.xx.xx.xx/10.18.40.95"; destination host is: "host-xx.xx.xx.xx":9000;
> Exception in thread "main" java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "host-xx.xx.xx.xx/10.18.40.95"; destination host is: "host-xx.xx.xx.xx":9000;
>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:760)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1169)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:184)
>         at $Proxy9.getFileInfo(Unknown Source)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:165)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:84)
>         at $Proxy9.getFileInfo(Unknown Source)
>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:612)
>         at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1382)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:748)
>         at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1197)
>         at org.apache.hadoop.hdfs.HAUtil.getAddressOfActive(HAUtil.java:288)
>         at org.apache.hadoop.hdfs.tools.DFSck.getCurrentNamenodeAddress(DFSck.java:224)
>         at org.apache.hadoop.hdfs.tools.DFSck.doWork(DFSck.java:230)
>         at org.apache.hadoop.hdfs.tools.DFSck.access$000(DFSck.java:70)
>         at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:129)
>         at org.apache.hadoop.hdfs.tools.DFSck$1.run(DFSck.java:126)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
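
On the client-side configuration question above: the "as:root (auth:KERBEROS)" in the log indicates the client is already picking up Kerberos settings from its configuration, and the immediate failure is the missing TGT noted in the reply. A quick way to double-check that core setting, assuming the stock etc/hadoop config directory of the 2.0.1 tarball shown above:

    # Run from the hadoop-2.0.1 directory; the config path is an assumption.
    grep -A 1 'hadoop.security.authentication' etc/hadoop/core-site.xml
    # Expected output for a secure client:
    #   <name>hadoop.security.authentication</name>
    #   <value>kerberos</value>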
Amith D K 2012-07-02, 04:53
Ivan Frain 2012-07-02, 12:23
Vinod Kumar Vavilapalli 2012-07-03, 02:05