Thanks, Michael S. We have tried that, but as one cluster has Kerberos and
the other doesn't, we were unable to make it work and could only connect
to one cluster at a time. In fact, we started from the same approach but
couldn't make it work, and that is one of the reasons we asked this
question. I think the LinkedIn question you are mentioning was indeed
posted by me.
We have even tried connecting only to HBase on the cluster with Kerberos
while simply making an HDFS connection (FileSystem.get) to the other, without
security, and it still doesn't work. It was after facing these roadblocks
that we posted these questions, just to make sure that we aren't missing
something (making a mistake), or using the API in the wrong way, or whether
it is even possible at all.
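Just to make the mismatch concrete, the difference between the two clusters comes down to this property in each cluster's core-site.xml (these are the standard values for this key; everything else in the files is omitted):

```xml
<!-- core-site.xml on the Kerberos-secured cluster -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>

<!-- core-site.xml on the unsecured cluster -->
<property>
  <name>hadoop.security.authentication</name>
  <value>simple</value>
</property>
```

Since both files define the same key, loading them into a single Configuration means only one value can take effect, which matches the behavior we saw.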
But as Ted has mentioned, connecting two clusters with different
security settings is not an advisable or proper design anyway. So correct
me if I am wrong, but what I now understand is that if the security
configuration is similar (both have Kerberos principals, even if different
ones, OR both use 'simple' auth), then either having 2 Configuration objects
or using distcp should work.
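For completeness, here is a minimal sketch of the two-Configuration-object approach as I understand it (the ZooKeeper quorum, namenode address, and table name below are placeholders, and this assumes both clusters use the same auth method):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;

public class TwoClusterSketch {
    public static void main(String[] args) throws Exception {
        // Source cluster: HBase. Created with HBaseConfiguration so the
        // HBase defaults are loaded; do NOT addResource the other
        // cluster's files into this object.
        Configuration hbaseConf = HBaseConfiguration.create();
        hbaseConf.set("hbase.zookeeper.quorum", "zk.cluster-a.example.com");

        // Destination cluster: HDFS. A completely separate Configuration,
        // so its fs.defaultFS (fs.default.name on Hadoop 1.x) does not
        // clobber the source cluster's settings.
        Configuration hdfsConf = new Configuration();
        hdfsConf.set("fs.defaultFS", "hdfs://nn.cluster-b.example.com:8020");

        HTable table = new HTable(hbaseConf, "source_table"); // read side
        FileSystem destFs = FileSystem.get(hdfsConf);         // write side

        // ... scan 'table' and write the rows out via 'destFs' ...
        table.close();
        destFs.close();
    }
}
```

The key point being that each client object is handed its own Configuration and neither config file is ever merged into the other.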
Like always, thanks.
On Sat, Apr 27, 2013 at 10:35 PM, Michael Segel
> This was answered on the linkedIn HBase discussion group.
> Create two configuration instances and then have each one point to the
> cluster you want to connect to.
> Then when you create an instance of HTable you use the correct configuration.
> On Apr 27, 2013, at 12:20 AM, Shahab Yunus <[EMAIL PROTECTED]> wrote:
> > Thanks, Ted, for the response. But the issue is that I want to read from
> > one cluster and write to another. If I have two clients, then how will they
> > communicate with each other? Essentially, what I am trying to do here is
> > inter-cluster data copy/exchange. Any other ideas or suggestions? Even if
> > both clusters have no security, or one has Kerberos, or both have
> > authentication, how do we exchange data between them?
> > I was actually not expecting that I cannot load multiple Hadoop or HBase
> > configurations in 2 different Configuration objects in one application.
> > As mentioned, I have tried overriding properties as well, but the
> > security/authentication properties get overwritten somehow.
> > Regards,
> > Shahab
> > On Fri, Apr 26, 2013 at 7:43 PM, Ted Yu <[EMAIL PROTECTED]> wrote:
> >> Looks like the easiest solution is to use separate clients, one for each
> >> cluster you want to connect to.
> >> Cheers
> >> On Sat, Apr 27, 2013 at 6:51 AM, Shahab Yunus <[EMAIL PROTECTED]> wrote:
> >>> Hello,
> >>> This is a follow-up to my previous post a few days back. I am trying to
> >>> connect to 2 different Hadoop clusters' setups through the same client,
> >>> and I am running into the issue that the config of one overwrites the
> >>> other.
> >>> The scenario is that I want to read data from an HBase table on one
> >>> cluster and write it as a file on HDFS on the other. Individually, if I
> >>> try to write to them, they both work, but when I try this through the
> >>> same Java client, they fail.
> >>> I have tried loading the core-site.xml through the addResource method of
> >>> the Configuration class, but only the first config file found is picked
> >>> up. I have also tried renaming the config files and then adding them as
> >>> resources (again through the addResource method).
> >>> The situation is compounded by the fact that one cluster is using
> >>> Kerberos authentication and the other is not. If the Kerberos cluster's
> >>> config file is found first, then authentication failures occur for the
> >>> other cluster when Hadoop tries to find client authentication
> >>> information. If the 'simple' cluster's config is loaded first, then an
> >>> 'Authentication is Required' error is encountered against the Kerberos
> >>> cluster.