Hive user mailing list: Hive Queries on S3 Data not working after moving to Hive metastore on CDH4
Re: Hive Queries on S3 Data not working after moving to Hive metastore on CDH4
Any ideas regarding this?

For now, I have resolved this issue by putting the Amazon credentials into
the Cloudera Manager Hive service safety valve and deploying the new client
configs to the Hive gateway nodes.
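For context, a minimal sketch of what such a safety valve entry might contain, assuming the stock Hadoop s3n property names from the script below (the values are placeholders, not real keys):

```xml
<!-- Appended to hive-site.xml via the Cloudera Manager safety valve -->
<property>
  <name>fs.s3n.awsAccessKeyId</name>
  <value>YOUR_ACCESS_KEY_ID</value>
</property>
<property>
  <name>fs.s3n.awsSecretAccessKey</name>
  <value>YOUR_SECRET_ACCESS_KEY</value>
</property>
```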

But this restricts me to using only one Amazon account for the Hive
operations.
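One possible workaround, which the MetaException below itself hints at, is to embed the credentials in the s3n URL when declaring each table location, so different tables can carry different accounts' keys. A hedged sketch with a hypothetical table name, placeholder bucket, and placeholder keys:

```
-- Hypothetical external table pointing at a second account's bucket;
-- the access key and secret key are embedded as the URL's username:password.
-- Caveat: a secret key containing '/' characters must be URL-encoded
-- (or the key regenerated) for this form to work.
CREATE EXTERNAL TABLE other_account_logs (line STRING)
LOCATION 's3n://ACCESS_KEY:SECRET_KEY@example-bucket/logs/';
```

Embedding keys in URLs does expose them in table metadata and logs, so it trades the single-account restriction for a credential-leakage risk.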

- Himanish

On Thu, May 2, 2013 at 8:54 AM, Himanish Kushary <[EMAIL PROTECTED]> wrote:

> Hi,
>
> We were running some Hive queries off data on Amazon S3. In the Hive script
> file we are including the access key and secret access key as below.
>
> set fs.s3.awsAccessKeyId=ABCD;
> set fs.s3.awsSecretAccessKey=XYZ;
> set fs.s3n.awsAccessKeyId=ABCD;
> set fs.s3n.awsSecretAccessKey=XYZ;
>
> and then we run the script using hive -f <<scriptfile>>
>
> Everything was working fine until we changed our hive configuration to use
> remote metastore (
> http://www.cloudera.com/content/cloudera-content/cloudera-docs/CDH4/4.2.0/CDH4-Installation-Guide/cdh4ig_topic_18_4.html)
> and also 'Bypass Hive Metastore Server' (
> http://blog.cloudera.com/blog/2013/03/how-to-set-up-cloudera-manager-4-5-for-apache-hive/
> )
>
> Now when we run the same script we get the following error, even though the
> script has set those values:
> FAILED: Error in metadata:
> MetaException(message:java.lang.IllegalArgumentException: AWS Access Key ID
> and Secret Access Key must be specified as the username or password
> (respectively) of a s3n URL, or by setting the fs.s3n.awsAccessKeyId or
> fs.s3n.awsSecretAccessKey properties (respectively).)
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask
>
>
> I do not want to set those values in hive-site.xml, as I may need to
> point to different S3 buckets with different credentials.
>
> Am I missing something in the configuration or in the script?
>
> --------------------------
> Thanks & Regards
> Himanish
>

--
Thanks & Regards
Himanish