Hive >> mail # user >> hi all


Thread:
shaik ahamed 2012-06-26, 07:42
Bejoy KS 2012-06-26, 08:14
shaik ahamed 2012-07-06, 11:39
Bejoy KS 2012-07-06, 11:52
shaik ahamed 2012-07-06, 12:47
Nitin Pawar 2012-07-06, 12:57
shaik ahamed 2012-07-11, 14:39
Mapred Learn 2012-07-11, 14:48
Hi Shaik

If you already have the data in HDFS, just create an external table with that HDFS location. The data will then be available through your Hive table.

If you want a managed table instead, a LOAD DATA statement is also a good option. It is fast as well, since under the hood it is an HDFS move operation that requires only a change to HDFS metadata.
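Both options can be sketched in HiveQL roughly as follows (the table names, columns, field delimiter, and HDFS paths here are illustrative assumptions, not from the original thread):

```sql
-- Option 1: external table pointing at the existing HDFS directory.
-- Hive reads the files in place; nothing is copied or moved.
CREATE EXTERNAL TABLE sales_ext (
  id   INT,
  item STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/shaik/data';

-- Option 2: managed table plus LOAD DATA.
-- LOAD DATA ... INPATH is an HDFS move under the hood, so even for
-- ~100 GB it only rewrites HDFS metadata rather than copying blocks.
CREATE TABLE sales_managed (
  id   INT,
  item STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

LOAD DATA INPATH '/user/shaik/data' INTO TABLE sales_managed;
```

Note that with Option 2 the source files are moved into the Hive warehouse directory, so they disappear from the original HDFS location; with Option 1 they stay where they are.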

Regards
Bejoy KS

Sent from handheld, please excuse typos.

-----Original Message-----
From: shaik ahamed <[EMAIL PROTECTED]>
Date: Wed, 11 Jul 2012 20:09:07
To: <[EMAIL PROTECTED]>
Reply-To: [EMAIL PROTECTED]
Subject: hi all

Hi All,

           I have 100 GB of data in HDFS, and I want to move or copy this
100 GB file into the Hive directory (path). How can I achieve this?

Is there any command to run for this?

Please suggest a solution that lets me load it quickly.
Thanks in advance

Shaik

Mohammad Tariq 2012-07-11, 14:45