Pig >> mail # dev >> [VOTE] Release Pig 0.10.0 (candidate 0)


Re: [VOTE] Release Pig 0.10.0 (candidate 0)
Thanks Dmitriy, that works. But I am wondering why the behavior is
different from the previous versions.

The difference I see in bin/pig (0.10.0 vs 0.9.1) is:

> # add HADOOP_CONF_DIR
> if [ "$HADOOP_CONF_DIR" != "" ]; then
>     CLASSPATH=${CLASSPATH}:${HADOOP_CONF_DIR}
> fi

AFAIK, this should not affect it - all it does is add the conf dir to the
classpath, which I was doing earlier through PIG_CLASSPATH in my wrapper
script.
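For illustration, the effect of the quoted 0.10.0 bin/pig snippet can be reproduced in isolation; the starting CLASSPATH and conf dir values below are hypothetical, not taken from the thread:

```shell
# Reproduce the quoted 0.10.0 bin/pig logic on its own: if HADOOP_CONF_DIR
# is set, it gets appended to CLASSPATH; if it is empty or unset, CLASSPATH
# is left alone (matching the pre-0.10 behavior).
CLASSPATH="/opt/pig/pig.jar"          # hypothetical starting classpath
HADOOP_CONF_DIR="/etc/hadoop/conf"    # hypothetical Hadoop conf dir

if [ "$HADOOP_CONF_DIR" != "" ]; then
    CLASSPATH=${CLASSPATH}:${HADOOP_CONF_DIR}
fi

echo "$CLASSPATH"   # prints /opt/pig/pig.jar:/etc/hadoop/conf
```

Setting PIG_CLASSPATH to the same conf dir, as the wrapper script did, produces an equivalent classpath entry, which is why the snippet alone should not change behavior.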

The issue here is that certain properties are not the same between the
client machine and the remote cluster, e.g. JAVA_HOME. Since Pig is client
side, it made sense for Pig not to pick up any cluster properties from
"hadoop-env.sh". I am not sure what change is now causing them to be
picked up.

On Mon, Apr 23, 2012 at 9:14 PM, Dmitriy Ryaboy <[EMAIL PROTECTED]> wrote:

> pig.sh understands the arguments below -- try setting HADOOP_CONF_DIR?
>
> # Environment Variables
> #
> #     JAVA_HOME          The java implementation to use. Overrides JAVA_HOME.
> #
> #     PIG_CLASSPATH      Extra Java CLASSPATH entries.
> #
> #     HADOOP_HOME/HADOOP_PREFIX    Environment HADOOP_HOME/HADOOP_PREFIX (0.20.205)
> #
> #     HADOOP_CONF_DIR    Hadoop conf dir
> #
> #     PIG_HEAPSIZE       The maximum amount of heap to use, in MB.
> #                        Default is 1000.
> #
> #     PIG_OPTS           Extra Java runtime options.
> #
> #     PIG_CONF_DIR       Alternate conf dir. Default is ${PIG_HOME}/conf.
> #
> #     HBASE_CONF_DIR     Optionally, the HBase configuration to run against
> #                        when using HBaseStorage
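Dmitriy's suggestion from the message above can be sketched as follows; the conf-dir path is a hypothetical example, not one from the thread:

```shell
# Point HADOOP_CONF_DIR at the target cluster's config directory before
# launching Pig, so the 0.10.0 bin/pig script adds it to the classpath
# itself. The path below is a hypothetical example.
export HADOOP_CONF_DIR="$HOME/hadoop/conf/cluster1"

echo "HADOOP_CONF_DIR=$HADOOP_CONF_DIR"
# "$PIG_HOME/bin/pig"   # launch commented out in this sketch
```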
>
>
>
> On Mon, Apr 23, 2012 at 8:45 PM, Prashant Kommireddi
> <[EMAIL PROTECTED]> wrote:
> > I have a wrapper script to switch between Pig versions and clusters.
> >
> > export PIG_HOME=$HOME/tools/Linux/hadoop/pig-$PIG_VERSION
> > export JAVA_HOME=$HOME/tools/Linux/jdk/jdk$JAVA_VERSION/
> > export PIG_CLASSPATH=$HOME/apps/gridforce/main/hadoop/conf/$HADOOP_CLUSTER
> >
> > HADOOP_CLUSTER contains the hadoop configs (endpoints) for the cluster I
> > want to point to.
> >
> > And then I do this to start pig.
> >
> > $PIG_HOME/bin/pig
> >
> > This works with previous versions pig-0.8.0 and pig-0.9.1. However,
> > pig-0.10.0 fails to pick up the right classpath.
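Given the HADOOP_CONF_DIR suggestion that resolved this thread, a minimal variant of the wrapper above that also works under 0.10.0 might look like this; the version and cluster values are placeholders, and the extra HADOOP_CONF_DIR export is the assumed fix rather than anything from the original script:

```shell
# Sketch of the wrapper with one added line for Pig 0.10.0: export
# HADOOP_CONF_DIR alongside PIG_CLASSPATH so bin/pig itself puts the
# cluster conf dir on the classpath. Values below are placeholders.
export PIG_VERSION=0.10.0
export JAVA_VERSION=1.6.0
export HADOOP_CLUSTER=cluster1

export PIG_HOME="$HOME/tools/Linux/hadoop/pig-$PIG_VERSION"
export JAVA_HOME="$HOME/tools/Linux/jdk/jdk$JAVA_VERSION/"
export PIG_CLASSPATH="$HOME/apps/gridforce/main/hadoop/conf/$HADOOP_CLUSTER"
export HADOOP_CONF_DIR="$PIG_CLASSPATH"   # added: 0.10.0 bin/pig reads this

echo "conf dir: $HADOOP_CONF_DIR"
# "$PIG_HOME/bin/pig"   # launch commented out in this sketch
```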
> >
> > $ ./pig.sh
> >  ..
> > .....
> > ......
> > 2012-04-23 20:42:35,340 [main] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: file:///
> >
> > Is this something with my script, or maybe the new version (0.10.0)?
> >
> > -Prashant
> >
> >
> > On Mon, Apr 23, 2012 at 8:30 PM, Dmitriy Ryaboy <[EMAIL PROTECTED]> wrote:
> >
> >> Uh, actually, one of the test-commit tests failed in my environment.
> >>
> >> In TestPigServer:
> >>
> >> Testcase: testDefaultPigProperties took 0.033 sec
> >>        Caused an ERROR
> >> null
> >> java.lang.NullPointerException
> >>        at org.apache.pig.test.TestPigServer.testDefaultPigProperties(TestPigServer.java:895)
> >>
> >> Something about my environment?
> >>
> >> D
> >>
> >>
> >> On Mon, Apr 23, 2012 at 6:36 PM, Dmitriy Ryaboy <[EMAIL PROTECTED]>
> >> wrote:
> >> > +1
> >> >
> >> >
> >> > Verified several jobs using Elephant-Bird loaders.
> >> > Tested correctness with pig.exec.mapPartAgg both true and false.
> >> > Verified license.
> >> > Verified release notes.
> >> > Ran test-commit
> >> >
> >> > D
> >> >
> >> > On Sat, Apr 21, 2012 at 12:27 PM, Daniel Dai <[EMAIL PROTECTED]> wrote:
> >> >> We should do a sanity check of the package: unit tests, e2e
> >> >> tests, piggybank tests, package integrity, package signature,
> >> >> license, etc. However, if we find a new bug at this stage, we
> >> >> usually push it to the next release unless it is a critical one.
> >> >>
> >> >> Thanks,
> >> >> Daniel
> >> >>
> >> >> On Sat, Apr 21, 2012 at 12:48 AM, Prashant Kommireddi
> >> >> <[EMAIL PROTECTED]> wrote:
> >> >>> Hi Daniel,
> >> >>>
> >> >>> What is required other than running the regular tests for testing