Permission related errors when running with a different user
Hi,
  I am running a job as a different user than the one Hadoop is installed with, and I am getting the following error. This is actually a YARN job that I am trying to run. Please help me resolve it.

2012-12-06 09:29:13,997 INFO  Client (Client.java:prepareJarResource(293)) - Copy App Master jar from local filesystem and add to local environment
2012-12-06 09:29:14,476 FATAL Client (Client.java:main(148)) - Error running CLient
org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/":kbonagir:supergroup:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:186)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:135)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4203)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4174)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1574)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1509)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:410)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:200)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:42590)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:427)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:916)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1692)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1688)
        at java.security.AccessController.doPrivileged(AccessController.java:284)
        at javax.security.auth.Subject.doAs(Subject.java:573)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1686)

        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:56)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:39)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:527)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:90)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:57)
        at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1250)
        at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1266)
        at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1090)
        at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1048)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:232)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:75)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:804)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:785)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:684)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:259)
        at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:232)
        at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1817)
        at Client.prepareJarResource(Client.java:299)
        at Client.launchAndMonitorAM(Client.java:509)
        at Client.main(Client.java:146)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
        at java.lang.reflect.Method.invoke(Method.java:611)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/":kbonagir:supergroup:drwxr-xr-x
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:186)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:135)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4203)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4174)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1574)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1509)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:410)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:200)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:42590)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:427)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:916)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1692)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:16