Hadoop, mail # user - How can I kill a file with bad permissions


Steve Lewis 2013-05-20, 13:02
Re: How can I kill a file with bad permissions
Harsh J 2013-05-20, 14:16
You should be able to change permissions at will, as the owner of the
entry. There is certainly no bug there, as demonstrated below in HDFS
2.x:

➜  ~  hadoop fs -ls
Found 1 item
drwxr-xr-x    - harsh harsh          0 2013-04-10 07:03 bin
➜  ~  hadoop fs -chmod 411 bin
➜  ~  hadoop fs -ls bin
ls: Permission denied: user=harsh, access=READ_EXECUTE,
inode="/user/harsh/bin":harsh:harsh:dr----x--x
➜  ~  hadoop fs -chmod 755 bin
➜  ~  hadoop fs -ls bin
Found 2 items
-rw-r--r--   3 harsh harsh     693508 2013-04-10 07:03 bin/wordcount
-rw-r--r--   3 harsh harsh     873736 2013-04-10 06:56 bin/wordcount-simple

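For reference, roughly the same fix can be applied from the Java FileSystem
API that the original poster was using. A minimal sketch, assuming the
directory from the demo above and the default HDFS configuration on the
classpath (the class name and the commented-out delete are illustrative,
not taken from the thread):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class FixPermissions {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);

    // Directory whose mode was accidentally set to 0411 (dr----x--x).
    Path dir = new Path("/user/harsh/bin");

    // As the owner you can change the mode regardless of the current bits.
    fs.setPermission(dir, new FsPermission((short) 0755));

    // Once restored, the entry could also be removed recursively if unwanted:
    // fs.delete(dir, true);

    fs.close();
  }
}

The same applies from the shell: chmod it back to a sane mode as the owner,
as shown above, after which listing or removing the directory works again.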
On Mon, May 20, 2013 at 6:32 PM, Steve Lewis <[EMAIL PROTECTED]> wrote:

> In trying to set file permissions using the Java API, I managed to set the
> permissions on a directory to
> dr----x--x
>
> Now I can neither change them nor get rid of the file.
> I tried fs -rmr but I get permission issues.
>
>
>
> --
>

--
Harsh J