Hive, mail # user - Re: java.lang.NoClassDefFoundError: com/jayway/jsonpath/PathUtil


Re: java.lang.NoClassDefFoundError: com/jayway/jsonpath/PathUtil
Sai Sai 2013-03-10, 12:26
Ramki/John
Many thanks, that really helped. I have run the ADD JAR statements in the new session and it appears to be working. However, I was wondering about bypassing MR: why would we do it, and what is the use of it? Will appreciate any input.
Thanks
Sai
________________________________
 From: Ramki Palle <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED]; Sai Sai <[EMAIL PROTECTED]>
Sent: Sunday, 10 March 2013 4:22 AM
Subject: Re: java.lang.NoClassDefFoundError: com/jayway/jsonpath/PathUtil
 

When you execute the following query,

hive> select * from twitter limit 5;

Hive runs it in local mode and does not use MapReduce.
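One way to confirm this (assuming your Hive CLI version supports EXPLAIN, which it should) is to look at the query plan; a plan that contains only a Fetch Operator and no map-reduce stage means no MapReduce job will be launched:

hive> explain select * from twitter limit 5;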

For the query,

hive> select tweet_id from twitter limit 5;

I think you need to add the JSON jars to overcome this error. You might have added these in a previous session. If you want these jars available for all sessions, put the ADD JAR statements into your $HOME/.hiverc file.
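For example, a $HOME/.hiverc could contain lines like the following (the jar names and paths here are placeholders only; use the actual JSON SerDe and json-path jars from your setup):

ADD JAR /path/to/hive-json-serde.jar;
ADD JAR /path/to/json-path.jar;
ADD JAR /path/to/json-smart.jar;

The Hive CLI runs these statements automatically at the start of every session.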

To bypass MapReduce,

set hive.exec.mode.local.auto = true;

to suggest that Hive use local mode to execute the query. If it still uses MR, try

set hive.fetch.task.conversion = more;
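For example, a fresh session might look like this (a minimal sketch; the property names are as above and the table name is from your example):

hive> set hive.exec.mode.local.auto = true;
hive> set hive.fetch.task.conversion = more;
hive> select tweet_id from twitter limit 5;

With fetch-task conversion enabled, the single-column query should return its rows directly, the way select * does, without launching a MapReduce job.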
-Ramki.

On Sun, Mar 10, 2013 at 12:19 AM, Sai Sai <[EMAIL PROTECTED]> wrote:

Just wondering if anyone has any suggestions:
>
>
>This executes successfully:
>
>
>hive> select * from twitter limit 5;
>
>
>This does not work:
>
>
>hive> select tweet_id from twitter limit 5; // I have given the exception info below:
>
>
>
>Here is the output of this:
>
>
>hive> select * from twitter limit 5;
>OK
>
>
>
>tweet_id    created_at    text    user_id    user_screen_name    user_lang
>122106088022745088    Fri Oct 07 00:28:54 +0000 2011    wkwkw -_- ayo saja mba RT @yullyunet: Sepupuuu, kita lanjalan yok.. Kita karokoe-an.. Ajak mas galih jg kalo dia mau.. "@Dindnf: doremifas    124735434    Dindnf    en
>122106088018558976    Fri Oct 07 00:28:54 +0000 2011    @egg486 특별히 준비했습니다!    252828803    CocaCola_Korea    ko
>122106088026939392    Fri Oct 07 00:28:54 +0000 2011    My offer of free gobbies for all if @amityaffliction play Blair snitch project still stands.    168590073    SarahYoungBlood    en
>122106088035328001    Fri Oct 07 00:28:54 +0000 2011    the girl nxt to me in the lib got her headphones in dancing and singing loud af like she the only one here haha    267296295    MONEYyDREAMS_    en
>122106088005971968    Fri Oct 07 00:28:54 +0000 2011    @KUnYoong_B2UTY Bị lsao đấy    269182160    b2st_b2utyhp    en
>Time taken: 0.154 seconds
>
>
>
>This does not work:
>
>
>hive> select tweet_id from twitter limit 5;
>
>
>
>
>
>Total MapReduce jobs = 1
>Launching Job 1 out of 1
>Number of reduce tasks is set to 0 since there's no reduce operator
>Starting Job = job_201303050432_0094, Tracking URL = http://ubuntu:50030/jobdetails.jsp?jobid=job_201303050432_0094
>Kill Command = /home/satish/work/hadoop-1.0.4/libexec/../bin/hadoop job  -kill job_201303050432_0094
>Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
>2013-03-10 00:14:44,509 Stage-1 map = 0%,  reduce = 0%
>2013-03-10 00:15:14,613 Stage-1 map = 100%,  reduce = 100%
>Ended Job = job_201303050432_0094 with errors
>Error during job, obtaining debugging information...
>Job Tracking URL: http://ubuntu:50030/jobdetails.jsp?jobid=job_201303050432_0094
>Examining task ID: task_201303050432_0094_m_000002 (and more) from job job_201303050432_0094
>
>Task with the most failures(4):
>-----
>Task ID:
>  task_201303050432_0094_m_000000
>
>URL:
>http://ubuntu:50030/taskdetails.jsp?jobid=job_201303050432_0094&tipid=task_201303050432_0094_m_000000
>-----
>Diagnostic Messages for this Task:
>java.lang.RuntimeException: Error in configuring object
>    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
>    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
>    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
>    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
>    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
>    at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:121)
>    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)