Re: DFS Permissions on Hadoop 2.x
Chris Nauroth 2013-06-18, 20:28
Prashant, can you provide more details about what you're doing when you see
this error?  Are you submitting a MapReduce job, running an HDFS shell
command, or doing some other action?  It's possible that we're also seeing
an interaction with some other change in 2.x that triggers a setPermission
call that wasn't there in 0.20.2.  I think the problem with the HDFS
setPermission API is present in both 0.20.2 and 2.x, but if the code in
0.20.2 never triggered a setPermission call for your usage, then you
wouldn't have seen the problem.
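
For context, here is a minimal sketch of the kind of client action that ends up
as the ClientProtocol.setPermission RPC shown in the stack trace below. The
path and mode are made up, and a MapReduce job submission may issue an
equivalent call internally when it prepares its staging directory:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class SetPermissionSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);

    // A client-side chmod like this becomes a setPermission RPC that the
    // NameNode validates via FSPermissionChecker/checkOwner, as in the trace.
    fs.setPermission(new Path("/mapred"), new FsPermission((short) 0777));
  }
}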

I'd like to gather these details for submitting a new bug report to HDFS.
 Thanks!

Chris Nauroth
Hortonworks
http://hortonworks.com/

On Tue, Jun 18, 2013 at 12:14 PM, Leo Leung <[EMAIL PROTECTED]> wrote:

>  I believe the property name should be “dfs.permissions”
>
> From: Prashant Kommireddi [mailto:[EMAIL PROTECTED]]
> Sent: Tuesday, June 18, 2013 10:54 AM
> To: [EMAIL PROTECTED]
> Subject: DFS Permissions on Hadoop 2.x
>
> Hello,
>
> We just upgraded our cluster from 0.20.2 to 2.x (with HA) and had a
> question around disabling dfs permissions on the latter version. For some
> reason, setting the following config does not seem to work:
>
> <property>
>         <name>dfs.permissions.enabled</name>
>         <value>false</value>
> </property>
>
> Any other configs that might be needed for this?
>
> Here is the stacktrace.
>
>
> 2013-06-17 17:35:45,429 INFO  ipc.Server - IPC Server handler 62 on 8020,
> call org.apache.hadoop.hdfs.protocol.ClientProtocol.setPermission from
> 10.0.53.131:24059: error:
> org.apache.hadoop.security.AccessControlException: Permission denied:
> user=smehta, access=EXECUTE,
> inode="/mapred":pkommireddi:supergroup:drwxrwx---
>
> org.apache.hadoop.security.AccessControlException: Permission denied:
> user=smehta, access=EXECUTE,
> inode="/mapred":pkommireddi:supergroup:drwxrwx---
>         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
>         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:161)
>         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:128)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4684)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkOwner(FSNamesystem.java:4640)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setPermissionInt(FSNamesystem.java:1134)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setPermission(FSNamesystem.java:1111)
>         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.setPermission(NameNodeRpcServer.java:454)
>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.setPermission(ClientNamenodeProtocolServerSideTranslatorPB.java:253)
>         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44074)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1002)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1695)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1691)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1689)
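
On the property name discussed above: in 2.x the switch is
"dfs.permissions.enabled", while "dfs.permissions" is the older 0.20-era name
that 2.x still appears to accept as a deprecated alias. Either way, the flag is
read by the NameNode, so it needs to be set in the NameNode's hdfs-site.xml and
picked up on a restart. A minimal sketch (hypothetical class name, assuming
HdfsConfiguration's deprecated-key mapping) for checking what the loaded
configuration resolves to:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hdfs.HdfsConfiguration;

public class PermissionsFlagCheck {
  public static void main(String[] args) {
    // HdfsConfiguration loads hdfs-default.xml/hdfs-site.xml and registers
    // deprecated key mappings, so a value set under the old "dfs.permissions"
    // name should also surface under "dfs.permissions.enabled".
    Configuration conf = new HdfsConfiguration();
    System.out.println("dfs.permissions.enabled = "
        + conf.getBoolean("dfs.permissions.enabled", true));
  }
}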