Hi Pinak -
If you want to use the WebHDFS REST interface, then you can set up Apache Knox as
the Hadoop REST gateway and authenticate against LDAP or other stores
through its Apache Shiro integration. This opens up your authentication
options beyond Kerberos. Knox would then proxy your access to HDFS and the rest
of Hadoop through the gateway.
If you intend to use only the Hadoop command-line tooling, then you are
limited to Kerberos for real authentication.
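For illustration, here is a minimal sketch of how a Java client might form a
WebHDFS request routed through a Knox gateway with HTTP Basic credentials. The
gateway host, port (8443 is the Knox default), and the "sandbox" topology name
are placeholders, not values from your deployment:

```java
import java.util.Base64;

// Sketch: build the Knox-proxied WebHDFS URL and the Basic auth header.
// Hostname, port, and topology name below are assumptions -- substitute
// the values from your own Knox deployment.
public class KnoxWebHdfsUrl {

    // Knox exposes WebHDFS under /gateway/<topology>/webhdfs/v1<path>
    static String listStatusUrl(String host, int port, String topology, String path) {
        return String.format(
            "https://%s:%d/gateway/%s/webhdfs/v1%s?op=LISTSTATUS",
            host, port, topology, path);
    }

    // Standard HTTP Basic scheme: "Basic " + base64("user:password")
    static String basicAuthHeader(String user, String password) {
        String token = Base64.getEncoder()
            .encodeToString((user + ":" + password).getBytes());
        return "Basic " + token;
    }

    public static void main(String[] args) {
        String url = listStatusUrl("knox.example.com", 8443, "sandbox", "/user/data");
        System.out.println(url);
        System.out.println(basicAuthHeader("myuser", "secret"));
        // An HttpURLConnection to that URL would then carry the header:
        //   conn.setRequestProperty("Authorization", basicAuthHeader(user, pass));
        // and Knox authenticates the credentials against LDAP via Shiro.
    }
}
```

The point is that the credentials ride in a standard HTTP header to the
gateway, rather than being embedded in an hdfs:// URL, which HDFS itself does
not support.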
On Fri, Jan 10, 2014 at 3:31 AM, Juan Carlos <[EMAIL PROTECTED]> wrote:
> As far as I know, the only authentication method available in HDFS 2.2.0
> is Kerberos, so it's not possible to authenticate with a URL.
> 2014/1/10 Pinak Pani <[EMAIL PROTECTED]>
>> Does HDFS provide any built-in authentication out of the box? I want to
>> make explicit access to HDFS from Java. I want people to access HDFS
>> using "username:password@hdfs://client.skynet.org:9000/user/data" or
>> something like that.
>> I am new to Hadoop. We are planning to use Hadoop mainly for archiving,
>> and probably processing at a later time. The idea is that customers can set
>> up their own HDFS cluster and provide us the HDFS URL to dump the data to.
>> Is it possible to access HDFS in a similar way to how we access
>> databases, using credentials?