Hive >> mail # user >> LOAD HDFS into Hive


venkatramanan 2013-01-25, 13:00
bejoy_ks@... 2013-01-25, 13:07
venkatramanan 2013-01-25, 13:13
Re: LOAD HDFS into Hive
Hi Bejoy KS,

This is working perfectly... thanks!
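For readers landing on this thread later, the approach Bejoy describes can be sketched as below. The table is declared EXTERNAL with a LOCATION pointing at the existing HDFS directory, so Hive reads the files in place and no LOAD DATA step (which moves files out of their original path) is needed. The path '/twitter_sample' and the column list follow the original message; adapt both to your layout:

```sql
-- External table over the existing HDFS directory. Hive queries the
-- files where they already sit; dropping the table later leaves the
-- underlying HDFS data untouched.
CREATE EXTERNAL TABLE Tweets (
  FromUserId STRING, Text STRING, FromUserIdString STRING,
  FromUser STRING, Geo STRING, Id BIGINT, IsoLangCode STRING,
  ToUserId INT, ToUserIdString STRING, CreatedAt STRING)
ROW FORMAT DELIMITED
  FIELDS TERMINATED BY '\t'
  LINES TERMINATED BY '\n'
LOCATION '/twitter_sample';
```

New files that the streaming job writes into '/twitter_sample' become visible to subsequent queries automatically; no further LOAD statement is required.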

On Friday 25 January 2013 06:43 PM, venkatramanan wrote:
> Thanks for your reply... I will try and get back to you.
>
> thanks,
> Venkat
>
> On Friday 25 January 2013 06:37 PM, [EMAIL PROTECTED] wrote:
>> Hi Venkataraman
>>
>> You can just create an external table and point its LOCATION at the
>> HDFS directory where the data resides.
>>
>> There is no need to perform an explicit LOAD operation here.
>>
>> Regards
>> Bejoy KS
>>
>> Sent from remote device, Please excuse typos
>> ------------------------------------------------------------------------
>> *From: * venkatramanan <[EMAIL PROTECTED]>
>> *Date: *Fri, 25 Jan 2013 18:30:29 +0530
>> *To: *<[EMAIL PROTECTED]>
>> *ReplyTo: * [EMAIL PROTECTED]
>> *Subject: *LOAD HDFS into Hive
>>
>> Hi,
>>
>> I need to load HDFS data into a Hive table.
>>
>> For example:
>>
>> I have Twitter data that is updated daily via the streaming API. The
>> responses are stored in an HDFS path named 'TwitterData'. I then try
>> to load that data into Hive using a 'LOAD DATA' statement. My problem
>> is that the HDFS data is gone from its original path once I load it.
>> Is there any way to load the data without losing it from HDFS?
>>
>> I create the table using the statement below:
>>
>> CREATE EXTERNAL TABLE Tweets (FromUserId String, Text string,
>> FromUserIdString String, FromUser String, Geo String, Id BIGINT,
>> IsoLangCode string, ToUserId INT, ToUserIdString string, CreatedAt
>> string) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' LINES
>> TERMINATED BY '\n';
>>
>> And I load the data using the statement below:
>>
>> LOAD DATA INPATH '/twitter_sample' INTO TABLE tweets;
>>
>> thanks in advance
>>
>> Thanks,
>> Venkat