venkatramanan 2013-01-25, 13:00
You can just create an external table with its LOCATION set to the HDFS directory where the data already resides.
There is no need for an explicit LOAD operation in that case.
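For example, a minimal sketch of such an external table, reusing the columns and delimiters from the CREATE statement below and assuming the files live under '/twitter_sample' (the path used in the LOAD statement; adjust to the actual directory):

```sql
-- Sketch only: external table pointed at the existing HDFS directory.
-- LOCATION makes Hive read the files in place; no LOAD DATA is needed,
-- and dropping the table leaves the underlying files untouched.
CREATE EXTERNAL TABLE tweets (
  FromUserId STRING, Text STRING, FromUserIdString STRING,
  FromUser STRING, Geo STRING, Id BIGINT, IsoLangCode STRING,
  ToUserId INT, ToUserIdString STRING, CreatedAt STRING)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '\t'
  LINES TERMINATED BY '\n'
LOCATION '/twitter_sample';
```

Because Hive only records the location, newly streamed files that land in that directory become visible to queries without any further load step.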
From: venkatramanan <[EMAIL PROTECTED]>
Date: Fri, 25 Jan 2013 18:30:29
To: <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: LOAD HDFS into Hive
I need to load HDFS data into a Hive table.
I have Twitter data that is updated daily via the streaming
API. These Twitter responses are stored under an HDFS path
('TwitterData'). I then try to load the data into Hive using
a 'LOAD DATA' statement. My problem is that the HDFS data is gone
from its original location once I load it. Is there any way to load the
data without losing the HDFS copy?
I create the table using the statement below:
CREATE EXTERNAL TABLE Tweets (FromUserId String, Text string,
FromUserIdString String, FromUser String, Geo String, Id BIGINT,
IsoLangCode string, ToUserId INT, ToUserIdString string, CreatedAt
string) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' LINES TERMINATED
BY '\n';
I load the data using the statement below:
LOAD DATA INPATH '/twitter_sample' INTO TABLE tweets;
thanks in advance