Pig, mail # user - Trying to get pig 0.11/0.12 working to solve 0.10's issues with python udf


Michał Czerwiński 2012-11-12, 16:47
Re: Trying to get pig 0.11/0.12 working to solve 0.10's issues with python udf
Cheolsoo Park 2012-11-12, 17:37
Hi Michal,

Caused by: java.lang.IllegalArgumentException: Can not create a Path from an empty string
at org.apache.hadoop.fs.Path.checkPathArg(Path.java:82)
at org.apache.hadoop.fs.Path.<init>(Path.java:90)
at org.apache.hadoop.fs.Path.<init>(Path.java:45)

Your error message indicates that there is a typo somewhere in your paths. I
believe your PIG_CLASSPATH is the problem:

PIG_CLASSPATH=$HCAT_HOME/share/hcatalog/hcatalog-0.4.0.jar::$HIVE_HOME/conf:$HADOOP_HOME/conf

You have a double colon (::) in the middle, and it will be interpreted as
an empty string.
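
For illustration, a minimal Java sketch of that failure mode (an editor's assumption about the mechanism, not code from Pig itself; it needs hadoop-common on the classpath, and the classpath value below is just a shortened stand-in for the PIG_CLASSPATH above): splitting a colon-separated list that contains "::" yields an empty entry, and constructing a Hadoop Path from that empty string throws exactly the IllegalArgumentException in the stack trace.

import org.apache.hadoop.fs.Path;

public class EmptyClasspathEntryDemo {
    public static void main(String[] args) {
        // Shortened stand-in for the PIG_CLASSPATH above; note the double colon.
        String pigClasspath = "hcatalog-0.4.0.jar::conf";
        for (String entry : pigClasspath.split(":")) {
            System.out.println("classpath entry: '" + entry + "'");
            // The middle entry is "", so this constructor throws
            // java.lang.IllegalArgumentException: Can not create a Path from an empty string
            new Path(entry);
        }
    }
}

Dropping the extra colon so that every entry in PIG_CLASSPATH is non-empty should make that exception go away.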

Thanks,
Cheolsoo

On Mon, Nov 12, 2012 at 8:47 AM, Michał Czerwiński <[EMAIL PROTECTED]> wrote:

> I am trying to use pig 0.11 and pig trunk (currently 0.12) because pig 0.10
> seems to be having issues with python udf...
>
> According to this
> http://www.mail-archive.com/[EMAIL PROTECTED]/msg05837.html
>
> " after replacing pig.jar and pig-withouthadoop.jar with the
> 0.11 ones from the svn trunk, they work like a charm."
>
> Well this is clearly not the case for me...
>
> The error I get is:
>
> Pig Stack Trace
> ---------------
> ERROR 2017: Internal error creating job configuration.
>
> org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias ll
>   at org.apache.pig.PigServer.openIterator(PigServer.java:841)
>   at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:696)
>   at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:320)
>   at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
>   at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
>   at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
>   at org.apache.pig.Main.run(Main.java:535)
>   at org.apache.pig.Main.main(Main.java:154)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>   at java.lang.reflect.Method.invoke(Method.java:597)
>   at org.apache.hadoop.util.RunJar.main(RunJar.java:197)
> Caused by: org.apache.pig.PigException: ERROR 1002: Unable to store alias ll
>   at org.apache.pig.PigServer.storeEx(PigServer.java:940)
>   at org.apache.pig.PigServer.store(PigServer.java:903)
>   at org.apache.pig.PigServer.openIterator(PigServer.java:816)
>   ... 12 more
> Caused by: org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobCreationException: ERROR 2017: Internal error creating job configuration.
>   at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:848)
>   at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:294)
>   at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:177)
>   at org.apache.pig.PigServer.launchPlan(PigServer.java:1269)
>   at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1254)
>   at org.apache.pig.PigServer.storeEx(PigServer.java:936)
>   ... 14 more
> Caused by: java.lang.IllegalArgumentException: Can not create a Path from an empty string
>   at org.apache.hadoop.fs.Path.checkPathArg(Path.java:82)
>   at org.apache.hadoop.fs.Path.<init>(Path.java:90)
>   at org.apache.hadoop.fs.Path.<init>(Path.java:45)
>   at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.shipToHDFS(JobControlCompiler.java:1455)
>   at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.putJarOnClassPathThroughDistributedCache(JobControlCompiler.java:1432)
>   at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:508)
>   ... 19 more
>
> ===============================================================================
> I am using the following startup script:
Michał Czerwiński 2012-11-12, 17:59
Cheolsoo Park 2012-11-12, 18:09
Michał Czerwiński 2012-11-12, 18:29
Cheolsoo Park 2012-11-12, 18:45
Michał Czerwiński 2012-11-13, 15:16
Michał Czerwiński 2012-11-13, 15:40
Cheolsoo Park 2012-11-13, 17:18
Michał Czerwiński 2012-11-13, 17:34