MapReduce, mail # user - How to Create file in HDFS using java Client with Permission


samir das mohapatra 2013-03-01, 11:20
Re: How to Create file in HDFS using java Client with Permission
Yanbo Liang 2013-03-15, 09:33
You must run this client program as user dasmohap; otherwise you cannot
create files under the directory "/user/dasmohap".
If there is no user called dasmohap on the client machine, create one, or
use the workaround described here:
http://stackoverflow.com/questions/11371134/how-to-specify-username-when-putting-files-on-hdfs-from-a-remote-machine

But note that these workarounds are not secure: with Hadoop's default
"simple" authentication, HDFS trusts whichever username the client reports.
Still, it makes sense for each user to upload their own data under their own
account.
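In a Java client, the same workaround can be expressed with
UserGroupInformation instead of an environment variable. This is a minimal
sketch, not a definitive implementation: the NameNode address
hdfs://namenode:8020 and the file path are placeholders you must replace, and
the hadoop-client jars need to be on the classpath. It only works under
simple (non-Kerberos) authentication:

```java
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class HdfsCreateAsUser {
    public static void main(String[] args) throws Exception {
        // Act as "dasmohap" regardless of the local OS user.
        // With simple auth, HDFS accepts this name as-is.
        UserGroupInformation ugi =
            UserGroupInformation.createRemoteUser("dasmohap");
        ugi.doAs(new PrivilegedExceptionAction<Void>() {
            public Void run() throws Exception {
                Configuration conf = new Configuration();
                // Placeholder NameNode address -- substitute your cluster's.
                conf.set("fs.defaultFS", "hdfs://namenode:8020");
                FileSystem fs = FileSystem.get(conf);
                // Write under a directory the acting user owns,
                // so the permission check passes.
                Path file = new Path("/user/dasmohap/samir_tmp/part-0");
                FSDataOutputStream out = fs.create(file);
                try {
                    out.writeUTF("hello hdfs");
                } finally {
                    out.close();
                }
                return null;
            }
        });
    }
}
```

The doAs wrapper matters: every FileSystem call inside run() is issued as the
remote user, so the file lands owned by dasmohap rather than the local user.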
2013/3/4 anil gupta <[EMAIL PROTECTED]>

> As per the error below, the user trying to write/read the file does not
> have appropriate permission.
>
> File not found org.apache.hadoop.security.AccessControlException:
> Permission denied: user=hadoop, access=WRITE,
> inode="/user/dasmohap/samir_tmp":dasmohap:dasmohap:drwxr-xr-x
>
> HTH,
> Anil
>
> On Fri, Mar 1, 2013 at 3:20 AM, samir das mohapatra <
> [EMAIL PROTECTED]> wrote:
>
>> Hi All,
>>     I wanted to know how to create a file in HDFS using a Java program.
>>
>>   I wrote some code; it works fine in the dev cluster, but I am getting an
>> error in another cluster.
>>
>> Error:
>> Writing data into HDFS...................
>> Creating file
>> File not found org.apache.hadoop.security.AccessControlException:
>> Permission denied: user=hadoop, access=WRITE,
>> inode="/user/dasmohap/samir_tmp":dasmohap:dasmohap:drwxr-xr-x
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:186)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:135)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4547)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4518)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1755)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:1690)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1669)
>>     at
>> org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:409)
>>     at
>> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:205)
>>     at
>> org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44068)
>>     at
>> org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
>>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:898)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1693)
>>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1689)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at javax.security.auth.Subject.doAs(Subject.java:396)
>>     at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1687)
>>
>>
>> Regards,
>> samir.
>>
>
>
>
> --
> Thanks & Regards,
> Anil Gupta
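The denial in the stack trace above follows from HDFS's POSIX-style
permission check: "/user/dasmohap/samir_tmp" is mode drwxr-xr-x and owned by
dasmohap:dasmohap, so user hadoop matches neither the owner nor the group,
falls through to the "other" bits (r-x), and therefore lacks WRITE. A
simplified illustration of that rule in plain Java (not Hadoop's actual
FSPermissionChecker, which also walks ancestor directories):

```java
public class PermCheck {
    // Simplified POSIX-style check over a 9-character mode string like
    // "rwxr-xr-x": choose the owner, group, or other triple for the caller,
    // then test whether the requested access character appears in it.
    static boolean allowed(String mode9, String owner,
                           String user, boolean userInGroup, char access) {
        String bits;
        if (user.equals(owner)) {
            bits = mode9.substring(0, 3); // owner bits
        } else if (userInGroup) {
            bits = mode9.substring(3, 6); // group bits
        } else {
            bits = mode9.substring(6, 9); // other bits
        }
        return bits.indexOf(access) >= 0;
    }

    public static void main(String[] args) {
        // "hadoop" is neither the owner "dasmohap" nor in its group,
        // so the "other" bits r-x apply and WRITE ('w') is denied --
        // exactly the AccessControlException quoted above.
        System.out.println(
            allowed("rwxr-xr-x", "dasmohap", "hadoop", false, 'w'));   // false
        // The owner itself would be allowed to write.
        System.out.println(
            allowed("rwxr-xr-x", "dasmohap", "dasmohap", false, 'w')); // true
    }
}
```

Reading, by contrast, would succeed for "hadoop": the other bits include 'r',
which is why listing the directory works while creating a file does not.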