Hive >> mail # user >> FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.FunctionTask


Re: FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.FunctionTask
--auxpath is not going to solve this problem; you have to set
HADOOP_CLASSPATH instead. From the error, it seems that your job client is
not able to load the class from the classpath.

Try -
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/reports/hive/ddc_jars/jwnl.jar

$ hive

hive> add jar ...
hive> create temporary ..
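Putting that together with the jar path and UDF class from the earlier mails in this thread (a sketch; the hive> steps are shown as comments because they run inside the Hive shell, not in sh):

```shell
# Extend HADOOP_CLASSPATH so the Hive CLI's own JVM (the job client)
# can load the UDF's dependency jar before hive is launched.
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/reports/hive/ddc_jars/jwnl.jar
echo "$HADOOP_CLASSPATH"   # sanity-check that the jar made it on
# hive
# hive> add jar /reports/hive/ddc_jars/jwnl.jar;
# hive> create temporary function StemTermsUDF
#     >   as 'org.apache.hadoop.hive.ql.udf.StemTermsUDF';
```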

In my last email, I mentioned the auxlib directory: the hive shell script
picks up all the jars in that directory and puts them on HADOOP_CLASSPATH
and the auxpath.
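A sketch of that auxlib layout (using a temp directory as a stand-in for a real Hive install dir, and an empty file as a stand-in for the jar):

```shell
# The hive launcher script adds every jar it finds in $HIVE_HOME/auxlib
# to HADOOP_CLASSPATH and to the auxpath, covering both the client JVM
# and the distributed jobs in one step.
HIVE_HOME=$(mktemp -d)                     # stand-in for the real install dir
mkdir -p "$HIVE_HOME/auxlib"
: > "$HIVE_HOME/auxlib/jwnl.jar"           # stand-in for the real jar
ls "$HIVE_HOME/auxlib"
```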

thanks,
Aniket

On Sun, Jan 22, 2012 at 2:08 PM, Tim Havens <[EMAIL PROTECTED]> wrote:

> hive --auxpath /reports/hive/ddc_jars/jwnl.jar
> Hive history
> file=/tmp/thavens/hive_job_log_thavens_201201222205_2003418921.txt
> hive> create temporary function StemTermsUDF as
> 'org.apache.hadoop.hive.ql.udf.StemTermsUDF';
> OK
> Time taken: 0.005 seconds
>
> however:
>
> with this in .hiverc:
> add jar /reports/hive/ddc_jars/jwnl.jar
>
> AND
> /etc/hive/conf/hive-site.xml
>
> <property>
>   <name>hive.aux.jars.path</name>
>   <value>/reports/hive/ddc_jars</value>
> </property>
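One thing that may matter here (version-dependent, so treat it as an assumption to verify rather than a confirmed fix): many Hive releases expect hive.aux.jars.path to be a comma-separated list of jar files, often as file:// URIs, rather than a bare directory, e.g.:

```xml
<property>
  <name>hive.aux.jars.path</name>
  <value>file:///reports/hive/ddc_jars/jwnl.jar</value>
</property>
```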
>
> results in:
>
> hive> create temporary function StemTermsUDF as
> 'org.apache.hadoop.hive.ql.udf.StemTermsUDF';
> java.lang.NoClassDefFoundError: net/didion/jwnl/JWNLException
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:264)
>         at
> org.apache.hadoop.hive.ql.exec.FunctionTask.getUdfClass(FunctionTask.java:119)
>         at
> org.apache.hadoop.hive.ql.exec.FunctionTask.createFunction(FunctionTask.java:75)
>         at
> org.apache.hadoop.hive.ql.exec.FunctionTask.execute(FunctionTask.java:63)
>         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:130)
>         at
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1063)
>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:900)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:748)
>         at
> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:209)
>         at
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:286)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:516)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:616)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:186)
> Caused by: java.lang.ClassNotFoundException: net.didion.jwnl.JWNLException
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>         ... 18 more
> FAILED: Execution Error, return code -101 from
> org.apache.hadoop.hive.ql.exec.FunctionTask
>
> On Sun, Jan 22, 2012 at 3:43 PM, Tim Havens <[EMAIL PROTECTED]> wrote:
>
>> Unfortunately the issue appears to be something with the Jar, or my UDF.
>>
>> What I can't seem to resolve is what's causing the -101 error code.
>>
>> Tim
>>
>>
>> On Sun, Jan 22, 2012 at 3:26 PM, Aniket Mokashi <[EMAIL PROTECTED]>wrote:
>>
>>> The simplest way would be to put the jar in the auxlib directory; that
>>> does both for you, I guess. After that you can directly create the
>>> temporary function in hive.
>>>
>>> ~Aniket
>>>
>>>
>>> On Sun, Jan 22, 2012 at 1:24 PM, Aniket Mokashi <[EMAIL PROTECTED]>wrote:
>>>
>>>> Add the jar to HADOOP_CLASSPATH when you launch hive. That should help.
>>>>
>>>> Thanks,
>>>> Aniket
>>>>
>>>>
>>>> On Sun, Jan 22, 2012 at 9:25 AM, Tim Havens <[EMAIL PROTECTED]>wrote:

"...:::Aniket:::... Quetzalco@tl"