HBase user mailing list
Re: "GSSException: No valid credentials provided" happens when user re-login has been done
Have you read http://hbase.apache.org/book.html#d0e5135?

See the description w.r.t. the principal's maxrenewlife.
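
For reference, a minimal sketch of what keytab-based login plus renewal looks like on the client side. This is not code from the thread: the class name, principal, and keytab path are placeholders, and it assumes a secure core-site.xml/hbase-site.xml is on the classpath.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.security.UserGroupInformation;

public class KeytabLoginSketch {
    public static void main(String[] args) throws IOException {
        // Picks up hbase-site.xml / core-site.xml from the classpath.
        Configuration conf = HBaseConfiguration.create();
        UserGroupInformation.setConfiguration(conf);

        // Log in from a keytab once at startup; later relogins reuse this keytab.
        UserGroupInformation.loginUserFromKeytab(
                "appuser/host.example.com@EXAMPLE.COM",   // placeholder principal
                "/etc/security/keytabs/appuser.keytab");  // placeholder path

        UserGroupInformation ugi = UserGroupInformation.getLoginUser();

        // Relogs in only when the TGT is close to expiring; it is a no-op if
        // the login did not come from a keytab (e.g. only a kinit ticket cache).
        ugi.checkTGTAndReloginFromKeytab();

        // ... HBase client calls run under this login ...
    }
}

If the process relies on a kinit-issued ticket cache rather than a keytab, checkTGTAndReloginFromKeytab() cannot fetch a fresh ticket, and renewal stops once the principal's maxrenewlife is reached, which could explain failures that only show up after the process has been running for a while.
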
On Sun, Sep 15, 2013 at 3:57 AM, Sujun Cheng <[EMAIL PROTECTED]> wrote:

> Hi all,
>
> In our project we call checkTGTAndReloginFromKeytab() every time before we
> access HBase, but "SASL authentication failed" still occurs after the
> program has been running for 1-2 days; the error log is shown below. If we
> restart the program it runs normally again, but the exception comes back
> after about the same amount of running time. As a workaround we also catch
> exceptions such as SaslException and log the user in again (a rough sketch
> of that approach follows the trace below), yet the problem still exists.
> What is the reason for this, and how can we solve it? Many thanks!
>
> java.lang.RuntimeException: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
> at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection$1.run(SecureClient.java:242)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1212)
> at sun.reflect.GeneratedMethodAccessor33.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:601)
> at org.apache.hadoop.hbase.util.Methods.call(Methods.java:37)
> at org.apache.hadoop.hbase.security.User.call(User.java:590)
> at org.apache.hadoop.hbase.security.User.access$700(User.java:51)
> at org.apache.hadoop.hbase.security.User$SecureHadoopUser.runAs(User.java:444)
> at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.handleSaslConnectionFailure(SecureClient.java:203)
> at org.apache.hadoop.hbase.ipc.SecureClient$SecureConnection.setupIOstreams(SecureClient.java:291)
> at org.apache.hadoop.hbase.ipc.HBaseClient.getConnection(HBaseClient.java:1124)
> at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:974)
> at org.apache.hadoop.hbase.ipc.SecureRpcEngine$Invoker.invoke(SecureRpcEngine.java:104)
> at $Proxy7.getClosestRowBefore(Unknown Source)
> at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1016)
> at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:882)
> at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:984)
> at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:886)
> at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:843)
> at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processBatchCallback(HConnectionManager.java:1533)
> at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processBatch(HConnectionManager.java:1418)
> at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:918)
> at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:774)
> at org.apache.hadoop.hbase.client.HTable.put(HTable.java:749)
> at org.apache.hadoop.hbase.client.HTablePool$PooledHTable.put(HTablePool.java:394)
> ..................
> at backtype.storm.daemon.executor$eval3836$fn__3837$tuple_action_fn__3839.invoke(executor.clj:566)
> at backtype.storm.daemon.executor$mk_task_receiver$fn__3760.invoke(executor.clj:345)
> at backtype.storm.disruptor$clojure_handler$reify__1583.onEvent(disruptor.clj:43)
> at backtype.storm.utils.DisruptorQueue.consumeBatchToCursor(DisruptorQueue.java:84)
> at backtype.storm.utils.DisruptorQueue.consumeBatchWhenAvailable(DisruptorQueue.java:58)
> at backtype.storm.disruptor$consume_batch_when_available.invoke(disruptor.clj:62)
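
The catch-and-relogin workaround described in the quoted message could look roughly like the sketch below. This is an assumption-laden illustration, not the poster's actual code: the table name, column family, and the single retry are made up, and on this HBase version the SASL failure may surface as a RuntimeException wrapping the SaslException rather than the SaslException itself.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.security.UserGroupInformation;

public class PutWithReloginSketch {
    public static void putWithRelogin(Configuration conf, Put put) throws IOException {
        try {
            doPut(conf, put);
        } catch (Exception e) {
            // On a failure that looks like expired credentials, force a fresh
            // login from the keytab and retry once.
            UserGroupInformation.getLoginUser().reloginFromKeytab();
            doPut(conf, put);
        }
    }

    private static void doPut(Configuration conf, Put put) throws IOException {
        HTable table = new HTable(conf, "my_table");   // placeholder table name
        try {
            table.put(put);
        } finally {
            table.close();
        }
    }

    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        Put put = new Put(Bytes.toBytes("row-1"));
        put.add(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("v"));   // placeholder cell
        putWithRelogin(conf, put);
    }
}

As noted above, reloginFromKeytab() only helps when the original login came from a keytab; it cannot refresh credentials that live only in an external ticket cache.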