Hive >> mail # user >> unable to see the file

Re: unable to see the file
Hi Shaik,

You have given the table location as '/user/hive/warehouse'. When you load data into a table with OVERWRITE, Hive first deletes the contents of the table directory and then writes the new file into it.

Here your table location is the same as the Hive warehouse directory itself. So when you ran LOAD DATA with OVERWRITE, the directories of all the other tables under the warehouse were deleted as well.
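To make the failure concrete, here is a sketch of the dangerous layout against a safe one. The table definition is taken from the original mail below; pointing each external table at its own subdirectory is an assumption, and any path other than the warehouse root would do:

```sql
-- DANGEROUS: the table location IS the warehouse root, so
-- LOAD DATA ... OVERWRITE wipes every table directory under it.
CREATE EXTERNAL TABLE test_data (vender STRING, supplier STRING,
    order_date STRING, quantity INT)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ' '
STORED AS TEXTFILE
LOCATION '/user/hive/warehouse';             -- warehouse root itself

-- SAFER: give the external table its own directory instead
-- ('/user/hive/warehouse/test_data' here is an assumed example path).
CREATE EXTERNAL TABLE test_data (vender STRING, supplier STRING,
    order_date STRING, quantity INT)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ' '
STORED AS TEXTFILE
LOCATION '/user/hive/warehouse/test_data';
```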

You can recover the data only if the trash is enabled in HDFS.
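If the trash was enabled (fs.trash.interval > 0 in core-site.xml), the deleted files land under the acting user's .Trash directory and can be copied back. A rough sketch, assuming the default trash layout; <username> is a placeholder for whoever ran the LOAD:

```shell
# Trash is only kept when fs.trash.interval is set in core-site.xml, e.g.:
#   <property>
#     <name>fs.trash.interval</name>
#     <value>1440</value>   <!-- minutes to retain deleted files -->
#   </property>

# Look for the deleted warehouse contents under the user's trash
# (path assumed; adjust the username):
hadoop fs -ls /user/<username>/.Trash/Current/user/hive/warehouse

# Copy the recovered directories back into place:
hadoop fs -cp /user/<username>/.Trash/Current/user/hive/warehouse /user/hive/
```

If the trash interval was 0 (the default on many clusters at the time), the data is gone from HDFS.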

Bejoy KS

Sent from handheld, please excuse typos.

-----Original Message-----
From: shaik ahamed <[EMAIL PROTECTED]>
Date: Thu, 26 Jul 2012 19:39:01
Subject: unable to see the file

Hi Users,

Before creating the table I enabled the settings below:

set hive.exec.compress.output=true;
set io.seqfile.compression.type=BLOCK;

Then I created an external table with the syntax below:

CREATE EXTERNAL TABLE test_data (vender string, supplier string, order_date string, quantity int)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ' '
STORED AS TEXTFILE
LOCATION '/user/hive/warehouse';

After that I loaded data into test_data.

I am not able to see the table in the HDFS path I gave before creating it ('/user/hive/warehouse'), and the other tables that already existed are not visible either.

Please reply regarding the above concern and tell me how I can see the table.

How can I get back the old existing HDFS files?