HDFS, mail # user - Permission related errors when running with a different user


Re: Permission related errors when running with a different user
Harsh J 2012-12-06, 15:28
You are attempting a job submit operation as user "root" over HDFS and
MR. Running a job involves placing the requisite files on HDFS, so that
MR can leverage its distributed presence to run task work. In the log
you quoted, this is the "Copy App Master jar from local filesystem"
step, and it is the HDFS write that fails.

The files are usually placed under a user's HDFS home directory, which
is of the form /user/[NAME]. By default, HDFS has no notion of a user
existing in it (imagine a Linux user account with no home-dir yet). So
you'll first have to provision the user with a home directory he can
own for himself, as the HDFS administrator:

Create the home directory, and grant ownership of it to the user root:

sudo -u hdfs hadoop fs -mkdir -p /user/root/
sudo -u hdfs hadoop fs -chown root:root /user/root
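
If you want to double-check the result, a plain listing of /user should
now show the new directory owned by root:

sudo -u hdfs hadoop fs -ls /user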

Once this is done, resubmit the job; the AccessControlException should
be resolved.
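
The same applies for any other user you want to submit jobs as; with
<username> as a placeholder for the actual account name:

sudo -u hdfs hadoop fs -mkdir -p /user/<username>
sudo -u hdfs hadoop fs -chown <username>:<username> /user/<username>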

On Thu, Dec 6, 2012 at 8:28 PM, Krishna Kishore Bonagiri
<[EMAIL PROTECTED]> wrote:
> Hi,
>   I am running a job with a different user than the one Hadoop is installed
> with, and getting the following error. Please help resolve it. This is
> actually a YARN job that I am trying to run.
>
> 2012-12-06 09:29:13,997 INFO  Client (Client.java:prepareJarResource(293)) - Copy App Master jar from local filesystem and add to local environment
> 2012-12-06 09:29:14,476 FATAL Client (Client.java:main(148)) - Error running CLient
> org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/":kbonagir:supergroup:drwxr-xr-x
>         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
>         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:186)
>         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:135)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4203)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4174)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1574)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1509)
>         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:410)
>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:200)
>         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:42590)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:427)
>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:916)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1692)
>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1688)
>         at java.security.AccessController.doPrivileged(AccessController.java:284)
>         at javax.security.auth.Subject.doAs(Subject.java:573)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1686)
>
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:56)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:39)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:527)
>         at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:90)
>         at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:57)
>         at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1250)
>         at

Harsh J