Hive >> mail # user >> Prevent users from killing each other's jobs

Re: Prevent users from killing each other's jobs

You need to set up Job ACLs. See http://hadoop.apache.org/docs/stable/mapred_tutorial.html#Job+Authorization.

It is a per-job configuration, but you can provide cluster-wide defaults. If the job owner wishes to give others access, he or she can do so.
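As a minimal sketch of what those defaults could look like in mapred-site.xml, using the MRv1 job-ACL property names from the linked tutorial (the user and group names here are placeholders, not anything from your cluster):

```xml
<!-- Cluster-wide defaults; each job may override at submission time. -->
<property>
  <name>mapreduce.job.acl-view-job</name>
  <!-- Format: comma-separated users, a space, then comma-separated groups -->
  <value>alice,bob analysts</value>
</property>
<property>
  <name>mapreduce.job.acl-modify-job</name>
  <!-- A single space means "no extra users or groups": only the job owner
       and cluster administrators may kill or modify the job. -->
  <value> </value>
</property>
```

A job owner can then widen access for one job, e.g. by passing `-Dmapreduce.job.acl-modify-job="alice"` when submitting. Note the ACLs are only enforced when `mapred.acls.enabled` is true and the cluster actually authenticates users (e.g. with the LinuxTaskController setup below).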

+Vinod Kumar Vavilapalli
Hortonworks Inc.

On Jul 30, 2013, at 11:21 AM, Murat Odabasi wrote:

> Hi there,
> I am trying to introduce some sort of security to prevent different
> people using the cluster from interfering with each other's jobs.
> Following the instructions at
> http://hadoop.apache.org/docs/stable/cluster_setup.html and
> https://www.inkling.com/read/hadoop-definitive-guide-tom-white-3rd/chapter-9/security
> this is what I put in my mapred-site.xml:
> <property>
>  <name>mapred.task.tracker.task-controller</name>
>  <value>org.apache.hadoop.mapred.LinuxTaskController</value>
> </property>
> <property>
>  <name>mapred.acls.enabled</name>
>  <value>true</value>
> </property>
> I can see the configuration parameters in the job configuration when I
> run a hive query, but the users are still able to kill each other's
> jobs.
> Any ideas about what I may be missing?
> Any alternative approaches I can adopt?
> Thanks.