Pig >> mail # user >> error to generate a map?


Yang 2012-06-08, 03:51
Alan Gates 2012-06-15, 15:58
Yang 2012-06-20, 22:56
Yang 2012-06-20, 23:00
Yang 2012-06-20, 23:28

Re: error to generate a map?
Seems to work in trunk.

On Wed, Jun 20, 2012 at 4:28 PM, Yang <[EMAIL PROTECTED]> wrote:

> a related issue is
>
>
> grunt> a = load 'a' as (x:chararray, y:chararray);
> grunt> b = foreach a generate { (x,x),(y,y) } as ff:bag{
> tt:tuple(hh:chararray, yy:chararray)};
> 2012-06-20 16:24:53,053 [main] ERROR org.apache.pig.tools.grunt.Grunt -
> ERROR 1200: Pig script failed to parse:
> <line 2, column 4> pig script failed to validate:
> org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1031:
> Incompatable field schema: left is
> "ff:bag{tt:tuple(hh:chararray,yy:chararray)}", right is
> ":bag{:tuple(:NULL)}"
> Details at logfile:
>
> /crypt/yyang_home/work/matching/ML-places-matcher/control_flows/match_suspects/pig_1340234677873.log
>
>
>
> here, if we pass two variables which have the same schema into the bag, Pig
> infers the schema bag{:tuple(:NULL)} instead of recognizing that both (x,x)
> and (y,y) are of type (:chararray, :chararray).
>
> currently I'm using a workaround: generate each bag separately and later
> UNION them
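>
> for example, one reading of that workaround (a sketch only; relation and
> field names are illustrative, untested):
>
> b1 = foreach a generate (x, x) as tt:tuple(hh:chararray, yy:chararray);
> b2 = foreach a generate (y, y) as tt:tuple(hh:chararray, yy:chararray);
> b = union b1, b2;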
>
>
> thanks
> Yang
>
> On Wed, Jun 20, 2012 at 4:00 PM, Yang <[EMAIL PROTECTED]> wrote:
>
> > Actually, I found that the GitHub version of Pig works fine for these
> > examples; likely some bug that has since been fixed.
> >
> >
> > On Wed, Jun 20, 2012 at 3:56 PM, Yang <[EMAIL PROTECTED]> wrote:
> >
> >> Thanks Alan.
> >>
> >> The problem does not seem to be the string key.
> >>
> >> If I have
> >>
> >> a = load 'a' as x:chararray;
> >>
> >> b = foreach a generate (['mykey'#x]);
> >>
> >> it fails, whereas if you change the x to 1 it works fine.
> >>
> >>
> >> The same issue of constants vs. variables also shows up in the following
> >> example for a bag:
> >>
> >> grunt>  b = foreach a generate {(x),(x)} as  bb:bag{T:tuple ( yy:int)};
> >>
> >> 2012-06-20 15:54:08,053 [main] ERROR org.apache.pig.tools.grunt.Grunt -
> >> ERROR 1200: <line 5, column 25>  rule null_keyword failed predicate:
> >> {input.LT(1).getText().equalsIgnoreCase("NULL")}?
> >> Details at logfile:
> >>
> /crypt/yyang_home/work/matching/ML-places-matcher/control_flows/match_suspects/pig_1340232590787.log
> >> grunt>  b = foreach a generate {(1),(1)} as  bb:bag{T:tuple ( yy:int)};
> >>
> >> The second one succeeds.
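> >>
> >> (As a possible workaround on affected versions, the TOBAG builtin, which
> >> wraps each of its arguments in a tuple, might avoid this parse error; a
> >> sketch only, untested against that Pig version:)
> >>
> >> b = foreach a generate TOBAG(x, x);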
> >>
> >>
> >> Thanks
> >> Yang
> >>
> >> On Fri, Jun 15, 2012 at 8:58 AM, Alan Gates <[EMAIL PROTECTED]
> >wrote:
> >>
> >>> Maps require string keys.  So it should read ['222'#1].
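> >>>
> >>> Applied to the script quoted below, that would read:
> >>>
> >>> a = load 'a';
> >>> b = foreach a generate ['222'#1];
> >>> dump b;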
> >>>
> >>> Alan.
> >>>
> >>> On Jun 7, 2012, at 8:51 PM, Yang wrote:
> >>>
> >>> > I ran the following simple pig script
> >>> >
> >>> >
> >>> > a = load 'a';
> >>> >
> >>> > b = foreach a generate [222#1];
> >>> >
> >>> > dump b;
> >>> >
> >>> >
> >>> > but it gave the following error
> >>> >
> >>> > $ pig -x local  a.pig
> >>> > 2012-06-07 20:49:13,039 [main] INFO  org.apache.pig.Main - Logging
> >>> error
> >>> > messages to:
> >>> >
> >>>
> /crypt/yyang_home/work/matching/ML-places-matcher/control_flows/match_suspects/pig_1339127353036.log
> >>> > 2012-06-07 20:49:13,168 [main] INFO
> >>> > org.apache.pig.backend.hadoop.executionengine.HExecutionEngine -
> >>> > Connecting to hadoop file system at: file:///
> >>> > 2012-06-07 20:49:13,692 [main] ERROR
> org.apache.pig.tools.grunt.Grunt -
> >>> > ERROR 1200: <file a.pig, line 3, column 23>  Syntax error, unexpected
> >>> > symbol at or near '['
> >>> >
> >>> >
> >>> >
> >>> > but this script is basically the same as the one copied from the manual:
> >>> > http://pig.apache.org/docs/r0.9.2/basic.html
> >>> >
> >>> > A = LOAD 'data' USING MyStorage() AS (T: tuple(name:chararray, age:
> >>> int));
> >>> > B = FILTER A BY T == ('john', 25);
> >>> > D = FOREACH B GENERATE T.name, [25#5.6], {(1, 5, 18)};
> >>> >
> >>> >
> >>> >
> >>> > how could I fix it?
> >>> >
> >>> >
> >>> >
> >>> > thanks!
> >>> >
> >>> > yang
> >>>
> >>>
> >>
> >
>