Re: Connect to HDFS running on a different Hadoop-Version
BigInsights? ... Ok, I'll be nice ...  :-)

Ok, so if I understand your question, you want a single HDFS file system to be shared by different 'Hadoop' frameworks (derivatives)?

First, it doesn't make sense. I mean it really doesn't make any sense.

Second, I don't think it would be possible except in the rare case that the two flavors of Hadoop come from the same code stream and a similar release level. As a hypothetical example, Oracle forks their own distro from Cloudera but makes relatively few changes under the hood.

But getting back to the first point... it's not a good idea when you consider that it violates the KISS principle of design.

IMHO, you would be better off with two clusters using distcp.
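
For illustration, a minimal sketch of driving such a copy from Java with the legacy org.apache.hadoop.tools.DistCp (CDH3 / 0.20-era); the host names, ports and paths are placeholders, and reading the source over hftp:// is the usual workaround when the two clusters speak different RPC versions:

// Sketch: copy /data between the two clusters with the legacy DistCp tool.
// Assumes org.apache.hadoop.tools.DistCp from a CDH3 / 0.20-era hadoop-core
// jar on the classpath; host names, ports and paths below are placeholders.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.tools.DistCp;
import org.apache.hadoop.util.ToolRunner;

public class CrossClusterCopy {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Reading the source over hftp:// (the NameNode HTTP port) instead of
        // hdfs:// sidesteps the RPC version check between mismatched releases.
        String src = "hftp://cdh3-namenode:50070/data";
        String dst = "hdfs://biginsights-namenode:8020/data";
        int rc = ToolRunner.run(new DistCp(conf), new String[] { src, dst });
        System.exit(rc);
    }
}

The same copy is normally just run from the command line as "hadoop distcp <src> <dst>", ideally on the destination cluster since hftp is read-only.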

Sent from my iPhone

On Jan 25, 2012, at 5:38 AM, "Romeo Kienzler" <[EMAIL PROTECTED]> wrote:

> Dear List,
>
> we're trying to use a central HDFS storage that can be accessed from various other Hadoop distributions.
>
> Do you think this is possible? We're having trouble, but it doesn't seem to be related to different RPC versions.
>
> When trying to access a Cloudera CDH3 Update 2 (cdh3u2) HDFS from BigInsights 1.3, we're getting this error:
>
> Bad connection to FS. Command aborted. Exception: Call to localhost.localdomain/127.0.0.1:50070 failed on local exception: java.io.EOFException
> java.io.IOException: Call to localhost.localdomain/127.0.0.1:50070 failed on local exception: java.io.EOFException
>        at org.apache.hadoop.ipc.Client.wrapException(Client.java:1142)
>        at org.apache.hadoop.ipc.Client.call(Client.java:1110)
>        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
>        at $Proxy0.getProtocolVersion(Unknown Source)
>        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
>        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
>        at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:111)
>        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:213)
>        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:180)
>        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>        at com.ibm.biginsights.hadoop.patch.PatchedDistributedFileSystem.initialize(PatchedDistributedFileSystem.java:19)
>        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1514)
>        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
>        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1548)
>        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1530)
>        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
>        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:111)
>        at org.apache.hadoop.fs.FsShell.init(FsShell.java:82)
>        at org.apache.hadoop.fs.FsShell.run(FsShell.java:1785)
>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>        at org.apache.hadoop.fs.FsShell.main(FsShell.java:1939)
> Caused by: java.io.EOFException
>        at java.io.DataInputStream.readInt(DataInputStream.java:375)
>        at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:815)
>        at org.apache.hadoop.ipc.Client$Connection.run(Client.java:724)
>
>
> But we've already replaced the client hadoop-common JARs with the Cloudera ones.
>
> Please note also that we're getting an EOFException and not an RPC.VersionMismatch.
>
> FsShell.java:
>
>        try {
>            init();
>        } catch (RPC.VersionMismatch v) {
>            System.err.println("Version Mismatch between client and server"
>                    + "... command aborted.");
>            return exitCode;
>        } catch (IOException e) {
>            System.err.println("Bad connection to FS. command aborted.");
>            System.err
>                    .println("Bad connection to FS. Command aborted. Exception: "
>                            + e.getLocalizedMessage());
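
To isolate whether the EOFException noted above comes from the swapped client JARs or simply from the address being dialed, a stand-alone check along these lines can help (a sketch; the URI is a placeholder, and note that 50070 is normally the NameNode's HTTP port, so the RPC address from fs.default.name, e.g. hdfs://host:8020, is what the client should point at):

// Stand-alone connectivity check (sketch): list the root of the remote HDFS
// using whatever Hadoop client JARs are on the classpath. The URI below is a
// placeholder for the NameNode's RPC address (fs.default.name), not its
// HTTP/web-UI address.
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsConnectCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:8020/"), conf);
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}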