

lulynn_2008 2013-07-19, 09:43
Re: how to use hcatalog DDL in pig?
Hi,

I don't think there is any documentation for this (which should be fixed), but I
found an example in the e2e tests:

grunt> sql drop table if exists pig_hcat_ddl_1;
grunt> sql create table pig_hcat_ddl_1(name string, age int, gpa double) stored as textfile;

Also, you need to set the following properties in the pig.properties file:

pig.sql.type=hcat (backend of sql, hcat is the only sql backend now)
hcat.bin=/usr/local/hcat/bin/hcat (binary location for hcat)
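If you script your setup, the two settings above can be appended to the
properties file automatically. A minimal sketch in Python (the pig.properties
path is a placeholder for wherever your Pig installation keeps it; the hcat.bin
value is the one from this thread):

```python
# Minimal sketch: make sure the two SQL-backend settings from this thread
# are present in a Java-style properties file. Paths are placeholders;
# adjust them to your installation.

def ensure_properties(path, required):
    """Append any missing key=value pairs to a properties file.

    Returns the dict of pairs that were actually added.
    """
    try:
        with open(path) as f:
            lines = f.read().splitlines()
    except FileNotFoundError:
        lines = []
    # Collect keys already defined, ignoring comments and blank lines.
    present = {
        line.split("=", 1)[0].strip()
        for line in lines
        if "=" in line and not line.lstrip().startswith("#")
    }
    missing = {k: v for k, v in required.items() if k not in present}
    if missing:
        with open(path, "a") as f:
            for key, value in missing.items():
                f.write(f"{key}={value}\n")
    return missing

# The two properties named above:
settings = {
    "pig.sql.type": "hcat",                  # hcat is the only SQL backend now
    "hcat.bin": "/usr/local/hcat/bin/hcat",  # location of the hcat binary
}
added = ensure_properties("pig.properties", settings)  # path is a placeholder
```

Running it a second time is a no-op, since existing keys are left untouched.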

I don't have HCatalog set up myself, so I cannot test this, but it should work.

Thanks,
Cheolsoo
On Fri, Jul 19, 2013 at 2:43 AM, lulynn_2008 <[EMAIL PROTECTED]> wrote:

>  Hi All,
> I see that Pig has supported HCatalog DDL integration since version 0.11.0 via
> https://issues.apache.org/jira/browse/PIG-2482, but I did not find any
> documentation showing how to use this feature. Could you please provide the
> steps for using HCatalog DDL in Pig?
>
> I am trying this with pig-0.11.1, but I encounter the following error:
> grunt> hcat -e create table h1(a int);
> 2013-07-19 17:34:07,512 [main] ERROR org.apache.pig.tools.grunt.Grunt -
> ERROR 1000: Error during parsing. Encountered " <IDENTIFIER> "hcat "" at
> line 2, column 1.
> Was expecting one of:
>     <EOF>
>     "cat" ...
>     "clear" ...
>     "fs" ...
>     "sh" ...
>     "cd" ...
>     "cp" ...
>     "copyFromLocal" ...
>     "copyToLocal" ...
>     "dump" ...
>     "describe" ...
>     "aliases" ...
>     "explain" ...
>     "help" ...
>     "history" ...
>     "kill" ...
>     "ls" ...
>     "mv" ...
>     "mkdir" ...
>     "pwd" ...
>     "quit" ...
>     "register" ...
>     "rm" ...
>     "rmf" ...
>     "set" ...
>     "illustrate" ...
>     "run" ...
>     "exec" ...
>     "scriptDone" ...
>     "" ...
>     "" ...
>     <EOL> ...
>     ";" ...
>
>
>