Hive >> mail # user >> Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask MapReduce Jobs Launched:


RE: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask MapReduce Jobs Launched:
Hello Dear Mohammad :-),

It's O.K. :-), Nitin & Bejoy helped me a lot :-).
Yes, I am using pseudo-distributed mode.

Thanks & Regards
Yogesh Kumar
:-)
________________________________________
From: Mohammad Tariq [[EMAIL PROTECTED]]
Sent: Tuesday, July 24, 2012 5:29 PM
To: [EMAIL PROTECTED]
Subject: Re: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask MapReduce Jobs Launched:

Hi yogesh,

     Sorry for being unresponsive. I have a tight schedule today. By
looking at your logs, it seems you are running your cluster in
pseudo-distributed mode. If that is the case, just make sure you have
commented out the following line in your /etc/hosts file:
127.0.1.1
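A minimal sketch of that edit, done on a throwaway copy of the file first (the hostname "hadoop-box" is a placeholder, and the sed invocation assumes GNU sed):

```shell
# Sample hosts file like the one Ubuntu-style installs generate
# ("hadoop-box" stands in for the machine's real hostname):
cat > /tmp/hosts.sample <<'EOF'
127.0.0.1    localhost
127.0.1.1    hadoop-box
EOF

# Comment out the 127.0.1.1 alias that confuses Hadoop daemons
# in pseudo-distributed mode:
sed -i 's/^127\.0\.1\.1/# 127.0.1.1/' /tmp/hosts.sample
cat /tmp/hosts.sample
```

Once the result looks right, apply the same sed line to /etc/hosts itself (with sudo, and keep a backup).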

Regards,
    Mohammad Tariq
On Tue, Jul 24, 2012 at 5:21 PM, Nitin Pawar <[EMAIL PROTECTED]> wrote:
> If you have a single-node cluster, then just configure everything to
> localhost. But if you have two nodes, that's a problem, because in any
> cluster the nodes need to talk to each other without authentication at
> the network level.
>
> On Tue, Jul 24, 2012 at 4:19 PM,  <[EMAIL PROTECTED]> wrote:
>> Thanks Nitin :-), Thank you very much.
>>
>> Yes, my network is set up with proxy authentication. If I remove the proxy for HTTP and HTTPS then it works, but the Internet goes down.
>>
>> I set the proxy bypass for the machine and it worked. :-)
>>
>> Thanks a Lot
>> Yogesh Kumar
>> ________________________________________
>> From: Nitin Pawar [[EMAIL PROTECTED]]
>> Sent: Tuesday, July 24, 2012 2:13 PM
>> To: [EMAIL PROTECTED]
>> Subject: Re: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask MapReduce Jobs Launched:
>>
>> Yogesh,
>> Is your network set up with proxy authentication? Normally you get
>> the error you are seeing when a node tries to connect to another
>> node but, after some time, the connection needs to be reauthorized.
>>
>> When you do a "select * from table", Hive does not launch a
>> MapReduce job; it just does a "dfs -cat" of the HDFS file.
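A quick way to see that difference from the shell (the table name is taken from this thread; this needs a working Hive install, so treat it as a sketch rather than something to run blindly):

```shell
# Served straight from HDFS: no MapReduce job is launched, so this can
# succeed even when the JobTracker/TaskTrackers are unhealthy.
hive -e 'SELECT * FROM dummysite;'

# Compiles to a MapReduce job: this is the first query that actually
# exercises the cluster, which is why only queries like it fail.
hive -e 'SELECT count(*) FROM dummysite;'
```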
>>
>> You will need to check the health of your Hadoop cluster. Also make
>> sure that on all the nodes the default Linux firewall (iptables) is
>> turned off.
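A rough checklist for that on a Hadoop 0.20-era deployment like the one in this thread (these have to run on the cluster machines, so they are a sketch, not a test you can run anywhere):

```shell
# HDFS health: live/dead datanodes, capacity, under-replicated blocks.
hadoop dfsadmin -report

# Filesystem check; the final lines say whether HDFS is HEALTHY.
hadoop fsck /

# Firewall status on each node; iptables must not block inter-node
# traffic (on RHEL-style systems, stop it with: sudo service iptables stop).
sudo service iptables status
```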
>>
>> On Tue, Jul 24, 2012 at 2:09 PM,  <[EMAIL PROTECTED]> wrote:
>>> Thanks Mohammad :-),
>>>
>>> I didn't get which proxy settings you mean. Please explain.
>>>
>>> If I run queries like:
>>> select * from dummysite;
>>> or
>>> select * from dummysite where id=10;
>>>
>>> such commands show proper results and don't throw any errors.
>>>
>>> Please suggest.
>>>
>>> Regards
>>> Yogesh Kumar
>>> ________________________________________
>>> From: Mohammad Tariq [[EMAIL PROTECTED]]
>>> Sent: Tuesday, July 24, 2012 2:00 PM
>>> To: [EMAIL PROTECTED]
>>> Subject: Re: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask MapReduce Jobs Launched:
>>>
>>> Hello Yogesh,
>>>
>>>         Are any proxy settings involved? Error code 407 indicates that
>>> the client must first authenticate itself with the proxy in order to
>>> proceed further. Just make sure everything is in place.
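One common fix (and the proxy bypass Yogesh reports using elsewhere in this thread) is to exempt local and cluster hosts from the proxy via the standard environment variables; a sketch, where the cluster hostnames are placeholders:

```shell
# Exempt local and cluster traffic from the HTTP proxy so inter-node
# connections never go through it ("master-host"/"slave-host" are
# placeholder names, not taken from the thread):
export no_proxy="localhost,127.0.0.1,master-host,slave-host"
export NO_PROXY="$no_proxy"   # some tools only read the upper-case form
echo "$no_proxy"              # prints: localhost,127.0.0.1,master-host,slave-host
```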
>>>
>>> Regards,
>>>     Mohammad Tariq
>>>
>>>
>>> On Tue, Jul 24, 2012 at 1:55 PM,  <[EMAIL PROTECTED]> wrote:
>>>> Hi All,
>>>>
>>>> Even if I perform a count(*) operation on the table, it shows the error:
>>>>
>>>> select count(*) from dummysite;
>>>>
>>>>
>>>> Total MapReduce jobs = 1
>>>> Launching Job 1 out of 1
>>>> Number of reduce tasks determined at compile time: 1
>>>> In order to change the average load for a reducer (in bytes):
>>>>   set hive.exec.reducers.bytes.per.reducer=<number>
>>>> In order to limit the maximum number of reducers:
>>>>   set hive.exec.reducers.max=<number>
>>>> In order to set a constant number of reducers:
>>>>   set mapred.reduce.tasks=<number>
>>>> Starting Job = job_201207231123_0011, Tracking URL =
>>>> http://localhost:50030/jobdetails.jsp?jobid=job_201207231123_0011
>>>> Kill Command = /HADOOP/hadoop-0.20.2/bin/../bin/hadoop job
>>>> -Dmapred.job.tracker=localhost:9001 -kill job_201207231123_0011
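"Return code 2" from MapRedTask is only a generic wrapper error; the real cause is in the failed task attempts. A sketch of where to dig on a 0.20-era setup like this one (job id taken from the output above):

```shell
# Overall job state and failure counts for the job Hive launched:
hadoop job -status job_201207231123_0011

# The per-task stderr/stdout and syslog are browsable from the
# JobTracker web UI at the Tracking URL printed by Hive:
#   http://localhost:50030/jobdetails.jsp?jobid=job_201207231123_0011
```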
