Re: CREATE EXTERNAL TABLE Fails on Some Directories
Dean Wampler 2013-02-15, 16:37
You confirmed that 715 is an actual directory? It didn't become a file by
accident?
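One quick way to confirm is from the shell. This is just a sketch, assuming the `hdfs` CLI that ships with the Cloudera VM and the `/715` path from your script:

```shell
# -test -d exits 0 only if /715 exists and is a directory.
if hdfs dfs -test -d /715; then
  echo "/715 is a directory"
else
  # Show what /715 actually is; a plain file at this path would
  # explain the "Parent Path is not a directory" error below.
  hdfs dfs -ls /715
fi
```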
By the way, you don't need to include the file name in the LOCATION; point it
at the directory and Hive will read all the files in it.
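For instance, your quoted DDL could be rerun with LOCATION pointing at the directory instead of the CSV. A sketch via `hive -e` (assuming the `hive` CLI on the VM; table and column names are kept from your script):

```shell
# Same table definition as in the quoted script, but LOCATION is the
# directory /715, not a file inside it. Hive will then scan every
# file under that directory.
hive -e "
CREATE EXTERNAL TABLE 715_table_name (
  col1 STRING,
  col2 STRING)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY ','
  LINES TERMINATED BY '\n'
STORED AS TEXTFILE
LOCATION '/715';"
```

Since the table is EXTERNAL, dropping it later leaves the files in /715 untouched.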
On Fri, Feb 15, 2013 at 10:29 AM, Joseph D Antoni <[EMAIL PROTECTED]> wrote:
> I'm trying to create a series of external tables for a time series of data
> (using the prebuilt Cloudera VM).
> The directory structure in HDFS is as follows:
> Each directory contains the same set of files, from a different day. They
> were all put into HDFS using the following script:
> for i in *;do hdfs dfs -put $i in $dir;done
> They all show up with the same ownership/perms in HDFS.
> Going into Hive to build the tables, I built a set of scripts to do the
> loads--then did a sed (changing 711 to 712,713, etc) to a file for each
> day. All of my loads work, EXCEPT for 715 and 716.
> Script is as follows:
> create external table 715_table_name
> (col1 string,
> col2 string)
> row format
> delimited fields terminated by ','
> lines terminated by '\n'
> stored as textfile
> location '/715/file.csv';
> This is failing with:
> Error in Metadata MetaException(message:Got except:
> org.apache.hadoop.fs.FileAlreadyExistsException Parent Path is not a
> directory: /715 715...
> As I mentioned, it works for all of the other directories except 715 and
> 716. Any thoughts on how to troubleshoot this?
> Joey D'Antoni
*Dean Wampler, Ph.D.*