The -test option of the "fs" shell prints no output; it communicates its
result solely through the command's exit code. See
http://tldp.org/LDP/abs/html/exit-status.html for more reading
material on what an exit code implies.
Hence, you should branch on the exit code directly:

if hadoop fs -test -d /user/lnindrakrishna/$DIRECTORY; then
  ...
fi

(or run the command and then inspect $?) for your script to work.
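As a minimal sketch of the exit-code pattern (the paths mirror the thread; the `hdfs_test_dir` function is a hypothetical local stand-in for `hadoop fs -test -d` so the control flow can be run without a cluster — swap in the real command in production):

```shell
#!/bin/sh
# Directory name built the same way as in the original script.
DIRECTORY=$(date +%m%d%Y)
TARGET="/user/lnindrakrishna/$DIRECTORY"

# Stand-in for `hadoop fs -test -d "$1"`: succeeds (exit 0) only if a
# matching local directory exists. Replace the body with the real
# hadoop command when running against HDFS.
hdfs_test_dir() {
    [ -d "./fake_hdfs$1" ]
}

# `if` branches on the command's exit code -- no output capture needed.
if hdfs_test_dir "$TARGET"; then
    RESULT=exists
    echo "Directory exists; would run: hadoop fs -rmr $TARGET"
else
    RESULT=missing
    echo "Directory does not exist; would run: hadoop fs -mkdir $TARGET"
fi
```

Note that `TestDir= hadoop fs -test -d …` (with a space after `=`) does not capture anything; it sets `TestDir` to empty for that one command. To store the status instead of branching immediately, run the command and save `$?` on the very next line.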
On Sat, Mar 23, 2013 at 11:12 PM, L N <[EMAIL PROTECTED]> wrote:
> I am trying to delete an HDFS directory if it already exists, from a shell
> script. I wrote the below commands in the script:
> DIRECTORY=`date +%m%d%Y`
> TestDir= hadoop fs -test -d /user/lnindrakrishna/$DIRECTORY
> echo $TestDir
> if [ $TestDir -eq 0 ]; then
> hadoop fs -rmr /user/lnindrakrishna/$DIRECTORY
> echo "Directory Deleted"
> echo "Directory does not Exist"
> exec hadoop fs -mkdir /user/lnindrakrishna/`date +%m%d%Y`
> I get the below output:
> [lnindrakrishna@lvshdc2en0011 ~]$ sh PXP_EAP_Process.sh
> PXP_EAP_Process.sh: line 9: [: -eq: unary operator expected
> Directory does not Exist
> mkdir: cannot create directory /user/lnindrakrishna/03232013: File exists
> Looks like hadoop fs -test -d /user/lnindrakrishna/$DIRECTORY is returning
> NULL, which is why echo $TestDir prints nothing; the script then falls
> through to the else part and displays "Directory does not Exist".
> What is wrong in the above shell script that I have written?