Re: Urgent Requirement: How to copy File from One cluster to another cluster using java client (through java Program)
Harsh J 2013-03-02, 05:45
Samir,

It was pointed out by another member in your earlier post, but here it is
again. The error returned is fairly clear:

org.apache.hadoop.security.AccessControlException: Permission denied:
user=hadoop, access=WRITE,
inode="/user/dasmohap/samir_tmp":dasmohap:dasmohap:drwxr-xr-x

1. Your program that is trying to write a file is running as a user called
"hadoop".
2. The target location on the cluster you are writing to, under the
path /user/dasmohap/samir_tmp, is owned by the user "dasmohap" there.
3. The target path has permission 755 (drwxr-xr-x), preventing writes by any
user other than its owner.
4. Your program, as mentioned in (1), tries to write as user "hadoop" and
fails for lack of write permission on the target.
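
You can confirm (2) and (3) from the client side with a quick status check
against the target NameNode. A minimal sketch; the hdfs://target-nn:8020
URI and the class name are placeholders for illustration:

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class CheckTargetPerms {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Connect to the *target* cluster's NameNode (placeholder URI).
        FileSystem fs = FileSystem.get(URI.create("hdfs://target-nn:8020"), conf);
        FileStatus st = fs.getFileStatus(new Path("/user/dasmohap/samir_tmp"));
        // For your path this should print: dasmohap dasmohap rwxr-xr-x
        System.out.println(st.getOwner() + " " + st.getGroup()
            + " " + st.getPermission());
      }
    }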

If you need the program to write there, you have to grant it write
permission first. You need to make /user/dasmohap/samir_tmp more widely
writable, with perhaps a 775 (if you can and know how to set up groups) or
a 777 (if you want a quick fix and aren't concerned in any way about
permissions and security).
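
From the shell, run as the owning user "dasmohap", that is just
"hadoop fs -chmod 775 /user/dasmohap/samir_tmp". It can also be done
through the same API, as in this minimal sketch (again assuming the
placeholder NameNode URI above; it must run as the path's owner):

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.permission.FsPermission;

    public class OpenUpTargetDir {
      public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(
            URI.create("hdfs://target-nn:8020"), new Configuration());
        // 0775 lets the owner and group write; run this as "dasmohap",
        // since only the owner (or a superuser) may change permissions.
        fs.setPermission(new Path("/user/dasmohap/samir_tmp"),
            new FsPermission((short) 0775));
      }
    }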

Hope this helps.
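
And since the underlying goal in your quoted mail below is the cross-cluster
copy itself: with the FileSystem API that generally looks like the sketch
below, using two FileSystem handles, one per cluster (NameNode URIs and file
paths are placeholders):

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.FileUtil;
    import org.apache.hadoop.fs.Path;

    public class CrossClusterCopy {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // One handle per cluster, each bound to its own NameNode (placeholders).
        FileSystem srcFs = FileSystem.get(URI.create("hdfs://source-nn:8020"), conf);
        FileSystem dstFs = FileSystem.get(URI.create("hdfs://target-nn:8020"), conf);
        // false = keep the source file after the copy completes.
        FileUtil.copy(srcFs, new Path("/data/input.txt"),
            dstFs, new Path("/user/dasmohap/samir_tmp/input.txt"),
            false, conf);
      }
    }
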
On Sat, Mar 2, 2013 at 1:13 AM, samir das mohapatra <[EMAIL PROTECTED]> wrote:

> Hi All,
>     Has anyone gone through the scenario of copying a file from one
> cluster to another using a Java application program (using the Hadoop
> FileSystem API)?
>
>   I have done something similar using a Java application; it works fine
> within the same cluster, but when I copy the file from one cluster to
> another I get the following error:
>
> File not found
> org.apache.hadoop.security.AccessControlException: Permission denied:
> user=hadoop, access=WRITE,
> inode="/user/dasmohap/samir_tmp":dasmohap:dasmohap:drwxr-xr-x
>     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
>     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:186)
>     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:135)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4547)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:4518)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1755)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:1690)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1669)
>     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:409)
>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:205)
>     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44068)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
>     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:898)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1693)
>     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1689)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1687)
>
>
> Regards,
> samir.

--
Harsh J