Hadoop user mailing list: getting there (EOF exception)


Thread:
  Jay Vyas    2011-10-30, 23:47
  Harsh J     2011-10-31, 03:48
  JAX         2011-10-31, 04:08
  Harsh J     2011-10-31, 04:21
Re: getting there (EOF exception).
Harsh! That was the trick!

I changed fs.default.name to 0.0.0.0 from "localhost".

Then my Java client could connect to my remote Hadoop namenode with no
problems!

Thanks!

In summary: if you need to connect to the namenode remotely, make sure it
is serving on 0.0.0.0, not localhost and not 127.0.0.1 (for those of you
who are ignorant like me: localhost != 0.0.0.0).
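
For reference, the change amounts to a few lines in core-site.xml on the
namenode. This is just a sketch from memory, assuming the stock default IPC
port 8020; your port may differ:

  <property>
    <name>fs.default.name</name>
    <value>hdfs://0.0.0.0:8020</value>
  </property>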

thank you thank you thank you

On Mon, Oct 31, 2011 at 12:21 AM, Harsh J <[EMAIL PROTECTED]> wrote:

> What is your fs.default.name set to? The namenode binds to the hostname
> provided there.
>
> On Mon, Oct 31, 2011 at 9:38 AM, JAX <[EMAIL PROTECTED]> wrote:
> > Thanks! Yes, I agree... but are you sure about 8020? 8020 serves on
> > 127.0.0.1 (rather than 0.0.0.0), so it is inaccessible to outside
> > clients. That is very odd... why would that be the case? Any insights?
> > (I am using Cloudera's Hadoop VM.)
> >
> > Sent from my iPad
> >
> > On Oct 30, 2011, at 11:48 PM, Harsh J <[EMAIL PROTECTED]> wrote:
> >
> >> Hey Jay,
> >>
> >> I believe this may be related to your other issues as well, but 50070
> >> is NOT the port you want to connect to. 50070 serves over HTTP, while
> >> the default port for IPC connections (fs.default.name) is 8020, or
> >> whatever you have configured.
> >>
> >> On 31-Oct-2011, at 5:17 AM, Jay Vyas wrote:
> >>
> >>> Hi guys: What is the meaning of an EOF exception when trying to
> >>> connect to Hadoop by creating a new FileSystem object? Does this
> >>> simply mean the system can't be read?
> >>>
> >>> java.io.IOException: Call to /172.16.112.131:50070 failed on local exception: java.io.EOFException
> >>>   at org.apache.hadoop.ipc.Client.wrapException(Client.java:1139)
> >>>   at org.apache.hadoop.ipc.Client.call(Client.java:1107)
> >>>   at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
> >>>   at $Proxy0.getProtocolVersion(Unknown Source)
> >>>   at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
> >>>   at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
> >>>   at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:111)
> >>>   at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:213)
> >>>   at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:180)
> >>>   at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
> >>>   at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1514)
> >>>   at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
> >>>   at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1548)
> >>>   at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1530)
> >>>   at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
> >>>   at sb.HadoopRemote.main(HadoopRemote.java:35)
> >>> Caused by: java.io.EOFException
> >>>   at java.io.DataInputStream.readInt(DataInputStream.java:375)
> >>>   at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:812)
> >>>   at org.apache.hadoop.ipc.Client$Connection.run(Client.java:720)
> >>>
> >>> --
> >>> Jay Vyas
> >>> MMSB/UCHC
> >>
> >
>
>
>
> --
> Harsh J
>
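
P.S. For anyone who finds this thread via search: on the client side, the
fix boils down to pointing the FileSystem at the namenode's IPC port
instead of the web port. This is just a sketch of the idea (the class name
is made up, it is not my actual sb.HadoopRemote code):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class RemoteHdfsCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Use the namenode's IPC port (8020 by default), NOT the 50070
        // web UI port. Pointing the RPC client at the HTTP port is
        // exactly what produced the EOFException above.
        conf.set("fs.default.name", "hdfs://172.16.112.131:8020");
        FileSystem fs = FileSystem.get(conf);
        // Simple sanity check that the remote namenode answers.
        System.out.println("/ exists: " + fs.exists(new Path("/")));
        fs.close();
    }
}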

--
Jay Vyas
MMSB/UCHC