Re: Tables and importing
Viral Bajaria 2010-12-16, 00:48
I had another question about LOAD DATA LOCAL INPATH.
It does an exec.CopyTask first and then an exec.MoveTask, but I think it
is leaving a file handle open: when I run a thread that bulk inserts
data into partitions, the open file count for the user under which my Hive
Thrift server runs keeps increasing until it reaches the maximum, after which I
start getting "connection refused".
I am currently still on Hive 0.5.0 and have checked out the svn repository,
but I can't figure out exactly where in the code the file handles are being left open.
Is anyone aware of similar problems?
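For what it's worth, this is roughly how I'm watching the open file count (a minimal sketch, Linux-only since it reads /proc; the `pgrep` pattern for finding the Thrift server PID is an assumption and will vary by setup):

```shell
#!/bin/sh
# Count the open file descriptors of a given PID via /proc (Linux only).
count_fds() {
    ls "/proc/$1/fd" | wc -l
}

# Sanity check against the current shell (always has at least
# stdin/stdout/stderr). For the Thrift server, substitute its PID,
# e.g. PID=$(pgrep -f HiveServer | head -n 1); count_fds "$PID"
count_fds $$
```

Running this in a loop while the bulk-insert thread is active makes the leak visible as a steadily growing count.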
On Wed, Dec 15, 2010 at 3:51 PM, Leo Alekseyev <[EMAIL PROTECTED]> wrote:
> You can use CREATE EXTERNAL TABLE... LOCATION.
> See http://wiki.apache.org/hadoop/Hive/LanguageManual/DDL and examples
> thereof. When you LOAD DATA INPATH, the directory gets moved to the
> Hive warehouse dir; it does not get modified.
> On Wed, Dec 15, 2010 at 3:33 PM, Mark <[EMAIL PROTECTED]> wrote:
> > Can someone explain what actually happens when you create a table and load
> > data into it using "LOAD DATA INPATH..."?
> > I noticed that when I load the data from files already existing in HDFS,
> > it actually removes the original file from its location and moves it to
> > the /user/hive directory. Is there any way I can prevent this from happening,
> > or is this just the way things work? At this point, is the file modified in
> > any way? I have some other Hadoop jobs that rely on this data. Should I
> > update those jobs to operate on the data within these directories? Thanks
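For reference, the CREATE EXTERNAL TABLE ... LOCATION approach Leo suggests might look something like this (a sketch only; the table name, columns, delimiter, and HDFS path are hypothetical placeholders, not anything from this thread):

```sql
-- Hypothetical example: an external table reads the files in place,
-- so existing Hadoop jobs can keep operating on the original path.
-- Dropping the table removes only the Hive metadata, not the data.
CREATE EXTERNAL TABLE page_views (
  view_time STRING,
  user_id   STRING,
  url       STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/user/data/page_views';
```

Unlike LOAD DATA INPATH, nothing is moved into the Hive warehouse directory, so the files stay where the other jobs expect them.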