Hive >> mail # user >> Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask MapReduce Jobs Launched:


Re: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask MapReduce Jobs Launched:
If you have a single-node cluster, then just configure everything to use
localhost. But if you have two nodes then that's a problem, because in
any cluster the nodes need to be able to talk to each other without
authentication at the network level.
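The eventual fix in this thread was a proxy bypass for the cluster machines. A minimal sketch of one way to express such a bypass with environment variables; the host names are placeholder examples, not the poster's actual machines:

```shell
# Hedged sketch: declare cluster-internal hosts that should bypass the HTTP
# proxy. Host names here are placeholder examples, not from the thread.
export no_proxy="localhost,127.0.0.1,namenode.example.com,datanode1.example.com"
export NO_PROXY="$no_proxy"   # some tools only read the upper-case variant
```

Many HTTP clients honour no_proxy/NO_PROXY; a system-wide or proxy-server-side bypass, as was ultimately used here, covers traffic these variables do not.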

On Tue, Jul 24, 2012 at 4:19 PM,  <[EMAIL PROTECTED]> wrote:
> Thanks Nitin :-), Thank you very much.
>
> Yes, my network is set up with proxy authentication. If I remove the proxy for HTTP and HTTPS then it works, but then Internet access goes down.
>
> I set the proxy bypass for the machine and it worked. :-)
>
> Thanks a Lot
> Yogesh Kumar
> ________________________________________
> From: Nitin Pawar [[EMAIL PROTECTED]]
> Sent: Tuesday, July 24, 2012 2:13 PM
> To: [EMAIL PROTECTED]
> Subject: Re: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask MapReduce Jobs Launched:
>
> Yogesh,
> Is your network set up with proxy authentication? Normally the error
> you are getting happens when a node tries to connect to another node
> but, after some time, the connection needs to be reauthorized.
>
> When you do a SELECT * FROM table, Hive does not launch a MapReduce
> job; it just does a dfs -cat of the HDFS file.
>
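The distinction described above can be illustrated from the shell. A hedged sketch, assuming the default warehouse path (/user/hive/warehouse), which the thread does not confirm:

```shell
# Hedged sketch: a plain SELECT * is roughly equivalent to catting the
# table's files, while aggregates such as COUNT(*) compile to a MapReduce
# job. /user/hive/warehouse is the assumed default warehouse location.
show_table_files() {
    if command -v hadoop >/dev/null 2>&1; then
        # Capture any error text too, so the function always reports something.
        hadoop fs -cat /user/hive/warehouse/dummysite/* 2>&1 || true
    else
        echo "hadoop CLI not on PATH"
    fi
}
show_table_files
```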
> You will need to check the health of your Hadoop cluster. Also make
> sure that on all the nodes the default Linux firewall (iptables) is
> off.
>
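The iptables check suggested above can be scripted per node. A minimal sketch: the listing command is standard, but how you actually disable the service varies by distribution and is not shown here:

```shell
# Hedged sketch: report firewall state on a node. Listing rules normally
# requires root, so fall back to a note if the query fails.
firewall_status() {
    if command -v iptables >/dev/null 2>&1; then
        iptables -L -n 2>/dev/null || echo "cannot query iptables (need root?)"
    else
        echo "iptables not installed"
    fi
}
firewall_status
```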
> On Tue, Jul 24, 2012 at 2:09 PM,  <[EMAIL PROTECTED]> wrote:
>> Thanks Mohammad :-),
>>
>> I didn't understand which proxy settings you mean; please explain.
>>
>> If I run operations like
>> select * from dummysite,
>> or
>> select * from dummysite where id=10;
>>
>> such commands show proper results and don't throw any errors.
>>
>> Please suggest.
>>
>> Regards
>> Yogesh Kumar
>> ________________________________________
>> From: Mohammad Tariq [[EMAIL PROTECTED]]
>> Sent: Tuesday, July 24, 2012 2:00 PM
>> To: [EMAIL PROTECTED]
>> Subject: Re: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask MapReduce Jobs Launched:
>>
>> Hello Yogesh,
>>
>>         Are any proxy settings involved? Error code 407 indicates that
>> the client must first authenticate itself with the proxy in order to
>> proceed. Just make sure everything is in place.
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
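For context, status 407 is HTTP's "Proxy Authentication Required". A small hedged sketch; the commented curl line and proxy host are illustrative assumptions, not commands from the thread:

```shell
# Hedged sketch: 407 = Proxy Authentication Required. One way to capture a
# status code through a proxy would be (proxy.example.com is a placeholder):
#   code=$(curl -s -o /dev/null -w '%{http_code}' \
#          -x http://proxy.example.com:8080 http://localhost:50030/)
is_proxy_auth_error() {
    [ "$1" = "407" ]
}
is_proxy_auth_error 407 && echo "proxy wants credentials first"
```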
>> On Tue, Jul 24, 2012 at 1:55 PM,  <[EMAIL PROTECTED]> wrote:
>>> Hi All,
>>>
>>> Even if I perform a count(*) operation on the table, it shows an error:
>>>
>>> select count(*) from dummysite;
>>>
>>>
>>> Total MapReduce jobs = 1
>>> Launching Job 1 out of 1
>>> Number of reduce tasks determined at compile time: 1
>>> In order to change the average load for a reducer (in bytes):
>>>   set hive.exec.reducers.bytes.per.reducer=<number>
>>> In order to limit the maximum number of reducers:
>>>   set hive.exec.reducers.max=<number>
>>> In order to set a constant number of reducers:
>>>   set mapred.reduce.tasks=<number>
>>> Starting Job = job_201207231123_0011, Tracking URL =
>>> http://localhost:50030/jobdetails.jsp?jobid=job_201207231123_0011
>>> Kill Command = /HADOOP/hadoop-0.20.2/bin/../bin/hadoop job
>>> -Dmapred.job.tracker=localhost:9001 -kill job_201207231123_0011
>>> Hadoop job information for Stage-1: number of mappers: 1; number of
>>> reducers: 1
>>> 2012-07-24 13:38:18,928 Stage-1 map = 0%,  reduce = 0%
>>> 2012-07-24 13:38:21,938 Stage-1 map = 100%,  reduce = 0%
>>> 2012-07-24 13:39:22,170 Stage-1 map = 100%,  reduce = 0%
>>> 2012-07-24 13:39:25,181 Stage-1 map = 100%,  reduce = 100%
>>> Ended Job = job_201207231123_0011 with errors
>>> Error during job, obtaining debugging information...
>>> Examining task ID: task_201207231123_0011_m_000002 (and more) from job
>>> job_201207231123_0011
>>> Exception in thread "Thread-93" java.lang.RuntimeException: Error while
>>> reading from task log url
>>>     at
>>> org.apache.hadoop.hive.ql.exec.errors.TaskLogProcessor.getErrors(TaskLogProcessor.java:130)
>>>     at
>>> org.apache.hadoop.hive.ql.exec.JobDebugger.showJobFailDebugInfo(JobDebugger.java:211)
>>>     at org.apache.hadoop.hive.ql.exec.JobDebugger.run(JobDebugger.java:81)

Nitin Pawar