Hadoop >> mail # user >> Running Hadoop client as a different user


Re: Running Hadoop client as a different user
Am not sure I'm getting your problem yet, but mind sharing the error
you see specifically? That'd give me more clues.

On Fri, May 17, 2013 at 2:39 PM, Steve Lewis <[EMAIL PROTECTED]> wrote:
> Here is the issue -
> 1 - I am running a Java client on a machine unknown to the cluster - my
> default name on this pc is
> HYPERCHICKEN\local_admin - the name known to the cluster is slewis
>
> 2 - The listed code
>   String connectString = "hdfs://" + host + ":" + port + "/";
>   Configuration config = new Configuration();
>   config.set("fs.default.name", connectString);
>   FileSystem fs = FileSystem.get(config);
>
> attempts to get a file system - to the best of my knowledge it has not
> altered the cluster.
> Yes, the subsequent code will attempt to write files in a directory where
> I may have permission - at least slewis does - but I cannot even get the
> file system.
>
>
>
> This is the relevant section of hdfs-site.xml:
> <!-- Permissions configuration -->
> <property>
> <name>dfs.umaskmode</name>
> <value>077</value>
> <description>
> The octal umask used when creating files and directories.
> </description>
> </property>
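(As an aside, the 077 value above is an octal mask of permission bits to *clear*: a file requested with mode 666 comes out as 600, a directory requested with 777 as 700. The arithmetic, in plain Java purely for illustration:)

```java
public class UmaskDemo {
    public static void main(String[] args) {
        int umask = 0077;          // octal literal: group/other bits to clear
        int requestedFile = 0666;  // typical requested mode for a new file
        int requestedDir = 0777;   // typical requested mode for a new directory

        // Effective mode = requested mode with the umask bits stripped.
        System.out.println(Integer.toOctalString(requestedFile & ~umask)); // 600
        System.out.println(Integer.toOctalString(requestedDir & ~umask));  // 700
    }
}
```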
>
> <property>
> <name>dfs.block.access.token.enable</name>
> <value>false</value>
> <description>
> Whether access tokens are used as capabilities for accessing datanodes.
> </description>
> </property>
>
> <property>
> <name>dfs.namenode.kerberos.principal</name>
> <value>nn/_HOST@${local.realm}</value>
> <description>
> Kerberos principal name for the NameNode
> </description>
> </property>
>
> <property>
> <name>dfs.secondary.namenode.kerberos.principal</name>
> <value>nn/_HOST@${local.realm}</value>
> <description>
> Kerberos principal name for the secondary NameNode.
> </description>
> </property>
>
>
> <property>
> <name>dfs.namenode.kerberos.https.principal</name>
> <value>host/_HOST@${local.realm}</value>
> <description>
> The Kerberos principal for the host that the NameNode runs on.
> </description>
> </property>
>
> <property>
> <name>dfs.secondary.namenode.kerberos.https.principal</name>
> <value>host/_HOST@${local.realm}</value>
> <description>
> The Kerberos principal for the host that the secondary NameNode runs on.
> </description>
> </property>
>
> <property>
> <name>dfs.secondary.https.port</name>
> <value>50490</value>
> <description>The https port where secondary-namenode binds</description>
>
> </property>
>
> <property>
> <name>dfs.datanode.kerberos.principal</name>
> <value>dn/_HOST@${local.realm}</value>
> <description>
> The Kerberos principal that the DataNode runs as. "_HOST" is replaced by
> the real host name.
> </description>
> </property>
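(The _HOST and ${local.realm} substitution that the description above refers to can be sketched as below. `resolvePrincipal` is a made-up helper for illustration, not a Hadoop API; Hadoop performs the equivalent replacement internally using each node's canonical host name.)

```java
public class PrincipalDemo {
    // Hypothetical helper mirroring Hadoop's substitution of _HOST and
    // ${local.realm} in Kerberos principal patterns.
    static String resolvePrincipal(String pattern, String realm, String host) {
        return pattern.replace("${local.realm}", realm).replace("_HOST", host);
    }

    public static void main(String[] args) {
        // A datanode principal pattern expanded for a specific node and realm.
        System.out.println(
            resolvePrincipal("dn/_HOST@${local.realm}", "EXAMPLE.COM", "dn1.example.com"));
        // dn/dn1.example.com@EXAMPLE.COM
    }
}
```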
>
> <property>
> <name>dfs.web.authentication.kerberos.principal</name>
> <value>HTTP/_HOST@${local.realm}</value>
> <description>
> The HTTP Kerberos principal used by Hadoop-Auth in the HTTP endpoint.
>
> The HTTP Kerberos principal MUST start with 'HTTP/' per Kerberos
> HTTP SPNEGO specification.
> </description>
> </property>
>
> <property>
> <name>dfs.web.authentication.kerberos.keytab</name>
> <value>/etc/security/keytabs/nn.service.keytab</value>
> <description>
> The Kerberos keytab file with the credentials for the
> HTTP Kerberos principal used by Hadoop-Auth in the HTTP endpoint.
> </description>
> </property>
>
> <property>
> <name>dfs.namenode.keytab.file</name>
> <value>/etc/security/keytabs/nn.service.keytab</value>
> <description>
> Combined keytab file containing the namenode service and host principals.
> </description>
> </property>
>
> <property>
> <name>dfs.secondary.namenode.keytab.file</name>
> <value>/etc/security/keytabs/nn.service.keytab</value>
> <description>
> Combined keytab file containing the namenode service and host principals.
> </description>
> </property>
>
> <property>
> <name>dfs.datanode.keytab.file</name>
> <value>/etc/security/keytabs/dn.service.keytab</value>
> <description>
> The filename of the keytab file for the DataNode.
> </description>
> </property>
>
> <property>

Harsh J
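
For readers hitting the same problem: the usual way to run a Hadoop client as a user other than the local OS account is UserGroupInformation.createRemoteUser plus doAs, which works on clusters using simple (non-Kerberos) authentication. A sketch, assuming the Hadoop client jars are on the classpath; the namenode host and port are placeholders:

```java
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.security.UserGroupInformation;

public class ConnectAsUser {
    public static void main(String[] args) throws Exception {
        final Configuration config = new Configuration();
        // Placeholder namenode address; substitute your own host and port.
        config.set("fs.default.name", "hdfs://namenode.example.com:8020/");

        // Act as "slewis" instead of the local login (e.g.
        // HYPERCHICKEN\local_admin). This only works with simple
        // authentication; a Kerberos-secured cluster requires real
        // credentials (kinit or a keytab) instead.
        UserGroupInformation ugi = UserGroupInformation.createRemoteUser("slewis");
        FileSystem fs = ugi.doAs(new PrivilegedExceptionAction<FileSystem>() {
            public FileSystem run() throws Exception {
                return FileSystem.get(config);
            }
        });
        System.out.println("Connected as: " + ugi.getShortUserName());
        fs.close();
    }
}
```

If the cluster is Kerberos-enabled (as the dfs.*.kerberos.principal properties above suggest it may be), impersonation additionally requires proxy-user configuration on the cluster side.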