To my understanding, there is no "HCatalog" service in Cloudera Manager,
and I shouldn't have to install HCatalog from an RPM.

I tried HCatalog with Sqoop 1, but could not get Parquet output. Here is
what I did:

1. hadoop fs -mkdir /tmp/action_t
2. hive> create external table action_t ( ...) stored as parquet location
'/tmp/action_t';
3. sqoop import --connect jdbc:teradata://teraserver/DATABASE=PDMPUBLIC
--compression-codec org.apache.hadoop.io.compress.SnappyCodec -- --batch-size 1000
(the command is abbreviated here; a sketch of its rough shape is below)
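
Since step 3 is abbreviated above, here is roughly the full shape of the
command I am trying. The username/password and the Teradata source table
name are placeholders, and --as-parquetfile is my guess at how to get
Sqoop to write Parquet directly (I am not certain my Sqoop/connector
version supports it):

  # placeholders: credentials and the Teradata source table name
  sqoop import \
    --connect jdbc:teradata://teraserver/DATABASE=PDMPUBLIC \
    --username sqoop_user \
    --password '*****' \
    --table ACTION_T \
    --target-dir /tmp/action_t \
    --as-parquetfile \
    --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
    -- --batch-size 1000
  # --batch-size is a Teradata-connector option passed after the "--" separator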

The problems are:
1. The job finished successfully, but the file in /tmp/action_t/_TEMP is in
text format, not Parquet.
2. If I use "--hcatalog-table action_text --create-hcatalog-table" instead,
the data is not loaded into Hive (a rough reconstruction of that variant is
below).
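
For problem 2, the HCatalog attempt looked roughly like this; credentials
and the source table are again placeholders, and the --hcatalog-storage-stanza
line is only my guess at what might be missing, since I understand the
stanza otherwise defaults to RCFile rather than Parquet:

  # placeholders for credentials and the source table; the storage stanza
  # is a guess at making the auto-created HCatalog table use Parquet
  sqoop import \
    --connect jdbc:teradata://teraserver/DATABASE=PDMPUBLIC \
    --username sqoop_user \
    --password '*****' \
    --table ACTION_T \
    --hcatalog-table action_text \
    --create-hcatalog-table \
    --hcatalog-storage-stanza "stored as parquet" \
    -- --batch-size 1000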

Did I miss something?