Re: issue with permissions of mapred.system.dir
It's a known issue with the fair scheduler in Hadoop 1.x; see
MAPREDUCE-4398. A workaround is to submit 4 or more jobs as the user
running the jobtracker, and everything will work fine afterwards. BTW,
the IBM BigInsights community version (open source) has contained the
proper fix (correctly initializing the job init threads) since
BigInsights 1.3.0.1. Unfortunately IBM devs are too busy to port/submit
the patches to Apache right now :)
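
Roughly, the workaround looks like the following (assuming the jobtracker
runs as the "mapred" user and the examples jar lives under
/usr/share/hadoop; adjust for your install):

  # submit four throwaway jobs as the jobtracker user to get the
  # job init threads past the state described in MAPREDUCE-4398
  for i in 1 2 3 4; do
    sudo -u mapred hadoop jar /usr/share/hadoop/hadoop-examples*.jar pi 2 10
  done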

__Luke

On Wed, Oct 10, 2012 at 9:32 AM, Goldstone, Robin J.
<[EMAIL PROTECTED]> wrote:
> There is no /hadoop1 directory.  The //hadoop1 in the URI is the name of the
> server running the namenode daemon:
>
> <value>hdfs://hadoop1/mapred</value>
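>
> (For context, the authority in that URI comes from fs.default.name in
> core-site.xml, along these lines; illustrative only, and the port shown is
> just the usual default:)
>
> <property>
>   <name>fs.default.name</name>
>   <value>hdfs://hadoop1:8020</value>
> </property>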
>
> Per offline conversations with Arpit, it appears this problem is related to
> the fact that I am using the fair scheduler.  The fair scheduler is designed
> to run MapReduce jobs as the submitting user, rather than under the mapred
> username.  Apparently this scheduler has some permissions issues: certain
> directories do not allow other users to execute/write in places that are
> necessary for the job to run.  I haven't yet tried Arpit's suggestion to
> switch to the default task scheduler but I imagine it will resolve my issue,
> at least for now.  Ultimately I do want to use the fair scheduler, as
> multi-tenancy is a key requirement for our Hadoop deployment.
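>
> For reference, switching schedulers is just a mapred-site.xml change; a
> minimal sketch, assuming the stock Hadoop 1.x class names:
>
> <property>
>   <name>mapred.jobtracker.taskScheduler</name>
>   <!-- fair scheduler; the shipped default is
>        org.apache.hadoop.mapred.JobQueueTaskScheduler -->
>   <value>org.apache.hadoop.mapred.FairScheduler</value>
> </property>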
>
> From: Manu S <[EMAIL PROTECTED]>
> Reply-To: "[EMAIL PROTECTED]" <[EMAIL PROTECTED]>
> Date: Wednesday, October 10, 2012 3:34 AM
> To: "[EMAIL PROTECTED]" <[EMAIL PROTECTED]>
> Subject: Re: issue with permissions of mapred.system.dir
>
> What are the permissions for the /hadoop1 dir in HDFS? Does the "mapred" user
> have permission on the same directory?
>
> Thanks,
> Manu S
>
> On Wed, Oct 10, 2012 at 5:52 AM, Arpit Gupta <[EMAIL PROTECTED]> wrote:
>>
>> What is your "mapreduce.jobtracker.staging.root.dir" set to? This is a
>> directory that needs to be writable by the user, and it is recommended to be
>> set to "/user" so jobs write into the appropriate user's home directory.
>>
>> --
>> Arpit Gupta
>> Hortonworks Inc.
>> http://hortonworks.com/
>>
>> On Oct 9, 2012, at 4:44 PM, "Goldstone, Robin J." <[EMAIL PROTECTED]>
>> wrote:
>>
>> I am bringing up a Hadoop cluster for the first time (but am an
>> experienced sysadmin with lots of cluster experience) and running into an
>> issue with permissions on mapred.system.dir.  It has generally been a chore
>> to figure out all the various directories that need to be created to get
>> Hadoop working, some on the local FS, others within HDFS, getting the right
>> ownership and permissions, etc.  I think I am mostly there but can't seem
>> to get past my current issue with mapred.system.dir.
>>
>> Some general info first:
>> OS: RHEL6
>> Hadoop version: hadoop-1.0.3-1.x86_64
>>
>> 20 node cluster configured as follows
>> 1 node as primary namenode
>> 1 node as secondary namenode + job tracker
>> 18 nodes as datanode + tasktracker
>>
>> I have HDFS up and running and have the following in mapred-site.xml:
>> <property>
>>   <name>mapred.system.dir</name>
>>   <value>hdfs://hadoop1/mapred</value>
>>   <description>Shared data for JT - this must be in HDFS</description>
>> </property>
>>
>> I have created this directory in HDFS, owner mapred:hadoop, permissions
>> 700, which seems to be the most common recommendation amongst multiple, often
>> conflicting articles about how to set up Hadoop.  Here is the top level of
>> my filesystem:
>> hyperion-hdp4@hdfs:hadoop fs -ls /
>> Found 3 items
>> drwx------   - mapred hadoop          0 2012-10-09 12:58 /mapred
>> drwxrwxrwx   - hdfs   hadoop          0 2012-10-09 13:00 /tmp
>> drwxr-xr-x   - hdfs   hadoop          0 2012-10-09 12:51 /user
>>
>> Note: it doesn't really seem to matter what permissions I set on /mapred,
>> since when the JobTracker starts up it changes them to 700.
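>>
>> (A rough sketch of how such a directory gets created and owned, assuming the
>> HDFS superuser is "hdfs"; the exact commands I ran may have differed:)
>>
>> # create mapred.system.dir in HDFS and hand it to mapred:hadoop
>> sudo -u hdfs hadoop fs -mkdir /mapred
>> sudo -u hdfs hadoop fs -chown mapred:hadoop /mapred
>> sudo -u hdfs hadoop fs -chmod 700 /mapred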
>>
>> However, when I try to run the hadoop example teragen program as a
>> "regular" user I am getting this error:
>> hyperion-hdp4@robing:hadoop jar /usr/share/hadoop/hadoop-examples*.jar