MapReduce, mail # user - Impersonating HDFS user


Thread:
Oleg Zhurakousky 2012-10-05, 13:19
Bertrand Dechoux 2012-10-05, 13:34
Oleg Zhurakousky 2012-10-05, 13:37
Oleg Zhurakousky 2012-10-05, 14:15
Bertrand Dechoux 2012-10-05, 14:33
Oleg Zhurakousky 2012-10-05, 14:40
Oleg Zhurakousky 2012-10-05, 14:42
Re: Impersonating HDFS user
Chris Nauroth 2012-10-05, 19:29
BTW, additional details on impersonation are here, including information
about a piece of configuration required to allow use of doAs.

http://hadoop.apache.org/docs/r1.0.3/Secure_Impersonation.html

Thank you,
--Chris
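
For reference, the "piece of configuration" Chris mentions is the proxy-user whitelist in core-site.xml on the cluster side. A minimal sketch, assuming 'hduser' is the superuser doing the impersonating; the host and group values below are placeholders, not taken from this thread:

```xml
<!-- core-site.xml on the NameNode/JobTracker:
     allow 'hduser' to impersonate users from these hosts/groups -->
<property>
  <name>hadoop.proxyuser.hduser.hosts</name>
  <value>client-host1,client-host2</value>
</property>
<property>
  <name>hadoop.proxyuser.hduser.groups</name>
  <value>group1,group2</value>
</property>
```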

On Fri, Oct 5, 2012 at 7:42 AM, Oleg Zhurakousky <[EMAIL PROTECTED]> wrote:

> Sorry, clicked send too soon, but basically changing that did not produce
> any result; I am still seeing the same message. So I guess my question is:
> what is the property responsible for that?
>
> Thanks
> Oleg
>
>
> On Fri, Oct 5, 2012 at 10:40 AM, Oleg Zhurakousky <
> [EMAIL PROTECTED]> wrote:
>
>> Yes, I understand that, and I guess I am trying to find that 'right
>> property'. I did find one reference to it in hdfs-default.xml:
>>
>> <name>dfs.datanode.address</name>
>>
>> <value>0.0.0.0:50010</value>
>>
>> so i changed that in my hdfs-site.xml to
>>
>> <name>dfs.datanode.address</name>
>>
>> <value>192.168.15.20:50010</value>
>>
>>
>> But
>>
>>
>>> On Fri, Oct 5, 2012 at 10:33 AM, Bertrand Dechoux <[EMAIL PROTECTED]> wrote:
>>
>>> Indeed, you are connecting to localhost, but you said it was a remote
>>> connection, so I guess nothing there is relevant for you. The main idea
>>> is that you need to provide the configuration files; they are read from
>>> the classpath by default. Anywhere you have a Configuration/JobConf you
>>> can also set the right properties yourself, namely the location of the
>>> HDFS master (and of mapred, if you want to do something with it).
>>>
>>> Regards
>>>
>>> Bertrand
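
What Bertrand describes above (setting the properties on a Configuration instead of relying on config files on the classpath) might be sketched like this against the Hadoop 1.x API; the NameNode URI is taken from the log lines elsewhere in this thread, and the rest is illustrative, not the poster's actual code:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class RemoteHdfsClient {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the client at the remote NameNode explicitly instead of
        // relying on core-site.xml / hdfs-site.xml being on the classpath.
        conf.set("fs.default.name", "hdfs://192.168.15.20:54310");
        FileSystem fs = FileSystem.get(conf);
        System.out.println(fs.exists(new Path("/")));
    }
}
```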
>>>
>>>
>>> On Fri, Oct 5, 2012 at 4:15 PM, Oleg Zhurakousky <
>>> [EMAIL PROTECTED]> wrote:
>>>
>>>> So now I am past it and able to RunAs 'hduser', but when I attempt to
>>>> read from an FSDataInputStream I see this message in my console:
>>>>
>>>> 10:12:10,065  WARN main hdfs.DFSClient:2106 - Failed to connect to /127.0.0.1:50010, add to deadNodes and continue
>>>> java.net.ConnectException: Connection refused
>>>>
>>>> 10:12:10,072  INFO main hdfs.DFSClient:2272 - Could not obtain block
>>>> blk_-4047236896256451627_1003 from any node: java.io.IOException: No
>>>> live nodes contain current block. Will get new block locations from
>>>> namenode and retry...
>>>>
>>>>
>>>> I am obviously missing a configuration setting somewhere... any ideas?
>>>>
>>>> Thanks
>>>>
>>>> Oleg
>>>>
>>>> On Fri, Oct 5, 2012 at 9:37 AM, Oleg Zhurakousky <
>>>> [EMAIL PROTECTED]> wrote:
>>>>
>>>>> After I clicked send I found the same link ;), but thank you anyway.
>>>>>
>>>>> Oleg
>>>>>
>>>>>
>>>>> On Fri, Oct 5, 2012 at 9:34 AM, Bertrand Dechoux <[EMAIL PROTECTED]> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> You might be looking for something like:
>>>>>> UserGroupInformation.createRemoteUser(user).doAs(
>>>>>>
>>>>>> see
>>>>>>
>>>>>> http://hadoop.apache.org/docs/r1.0.3/api/org/apache/hadoop/security/UserGroupInformation.html
>>>>>>
>>>>>> It is a JAAS wrapper for Hadoop.
>>>>>>
>>>>>> Regards
>>>>>>
>>>>>> Bertrand
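
The doAs pattern Bertrand points at might be sketched like this for the Hadoop 1.x API; the user name 'hduser' and the NameNode URI come from elsewhere in this thread, and the rest is an assumption for illustration:

```java
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.security.UserGroupInformation;

public class RunAsHduser {
    public static void main(String[] args) throws Exception {
        // Build a UGI for the target user without Kerberos credentials.
        UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hduser");
        ugi.doAs(new PrivilegedExceptionAction<Void>() {
            public Void run() throws Exception {
                Configuration conf = new Configuration();
                conf.set("fs.default.name", "hdfs://192.168.15.20:54310");
                // Every FileSystem call inside run() is made as 'hduser'.
                FileSystem fs = FileSystem.get(conf);
                System.out.println(fs.getHomeDirectory());
                return null;
            }
        });
    }
}
```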
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Oct 5, 2012 at 3:19 PM, Oleg Zhurakousky <
>>>>>> [EMAIL PROTECTED]> wrote:
>>>>>>
>>>>>>> I am working on some samples where I want to write to HDFS running
>>>>>>> on another machine (different OS, etc.). The identity of my client
>>>>>>> process is just whatever my OS says it is (e.g., 'oleg'), hence:
>>>>>>>
>>>>>>> 08:56:49,240 DEBUG IPC Client (47) connection to /192.168.15.20:54310 from oleg ipc.Client:803 - IPC Client (47) connection to /192.168.15.20:54310 from oleg got value #2
>>>>>>>
>>>>>>> But there is no 'oleg' where Hadoop is running; instead, there is
>>>>>>> 'hduser'.
>>>>>>>
>>>>>>> Is there an equivalent of "RunAs" in Hadoop?
>>>>>>>
>>>>>>> Thanks
>>>>>>>  Oleg
>>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Bertrand Dechoux
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>> --
>>> Bertrand Dechoux
>>>
>>
>>
>
Oleg Zhurakousky 2012-10-05, 23:58