Hive user mailing list: question on output hive table to file


Thread:
zuohua zhang, 2012-08-07 04:09
Vinod Singh, 2012-08-07 04:43
zuohua zhang, 2012-08-07 04:46
Vinod Singh, 2012-08-07 04:50
zuohua zhang, 2012-08-07 04:57
Vinod Singh, 2012-08-07 05:04
Gabi D, 2012-08-07 08:46

Gabi D (2012-08-07, 08:46) wrote:
RE: question on output hive table to file
I use the following example to set my own delimiter; I hope it is easy to adjust to your needs:

hive> create external table input (a int, b string, c float)
      row format delimited fields terminated by "\t"
      stored as sequencefile
      location 's3://path/to/data/input/';
hive> create external table output (a int, b string, c float)
      row format delimited fields terminated by "~"
      stored as textfile
      location 's3://path/to/data/output/';
hive> insert overwrite table output select * from input;
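
To check that the "~" delimiter actually made it into the output, you can look at one of the files Hive wrote (a sketch, not from the thread: the part-file name 000000_0 is a guess, and reading the s3:// path with the hadoop CLI assumes the S3 filesystem is configured on your cluster, e.g. on EMR):

$ hadoop fs -cat 's3://path/to/data/output/000000_0' | head -n 5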

Let me know if it works!

Tony
From: zuohua zhang [mailto:[EMAIL PROTECTED]]
Sent: 07 August 2012 05:58
To: [EMAIL PROTECTED]
Subject: Re: question on output hive table to file

Thanks so much, that did work! I have 200+ columns, so the concat gets quite ugly. No shortcut?
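
Not from the thread, but worth noting: in Hive 0.11 and later (newer than this exchange), INSERT OVERWRITE DIRECTORY accepts its own ROW FORMAT clause, which avoids spelling out concat over 200+ columns. A sketch, reusing the table and path from the messages below:

INSERT OVERWRITE DIRECTORY '/outputtable.txt'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
SELECT * FROM myoutputtable;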
On Mon, Aug 6, 2012 at 9:50 PM, Vinod Singh <[EMAIL PROTECTED]> wrote:
Change the query to something like this:

INSERT OVERWRITE DIRECTORY '/outputtable.txt'
SELECT concat(col1, ',', col2, ',', col3) FROM myoutputtable;

That way the columns will be separated by commas.
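
If writing ',' between every pair of columns gets tedious, concat_ws takes the separator once. A small variation, not from the thread: col1, col2, col3 are placeholder names, and non-string columns may need a cast to string for concat_ws.

SELECT concat_ws(',', col1, col2, cast(col3 AS string)) FROM myoutputtable;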

Thanks,
Vinod

On Tue, Aug 7, 2012 at 10:16 AM, zuohua zhang <[EMAIL PROTECTED]> wrote:
I used the following, so why doesn't it help?

ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'

On Mon, Aug 6, 2012 at 9:43 PM, Vinod Singh <[EMAIL PROTECTED]> wrote:
Columns of a Hive table are separated by the ^A (\001) character by default. Instead of doing a "SELECT *", you may want to use the concat function to get a separator of your choice.
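
The '\t' on the intermediate table in the query below does not carry through because, in Hive versions of this vintage, INSERT OVERWRITE DIRECTORY writes query results with Hive's default delimiters regardless of the source table's ROW FORMAT. For reference, this is roughly what that default output looks like; cat -v makes the ^A (\001) delimiter visible (the part-file name and the sample row are illustrative, not from the thread):

$ hadoop dfs -cat /outputtable.txt/000000_0 | cat -v | head -n 1
1^Asome text^A2.5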

Thanks,
Vinod

On Tue, Aug 7, 2012 at 9:39 AM, zuohua zhang <[EMAIL PROTECTED]> wrote:
I have used the following to output a hive table to a file:
DROP TABLE IF EXISTS myoutputtable;
CREATE TABLE myoutputtable
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
AS
SELECT * FROM originaltable;

INSERT OVERWRITE DIRECTORY '/outputtable.txt'
SELECT * FROM myoutputtable;

Then I used:
hadoop dfs -getmerge /outputtable.txt /mnt/

but the /mnt/outputtable.txt file shows strange ^A characters. What did I do wrong?
