Hadoop >> mail # user >> writing to hdfs via java api

I found a way to connect to Hadoop via HFTP, and it works fine (read-only):

    uri = "hftp://172.16.xxx.xxx:50070/";

    System.out.println( "uri: " + uri );
    Configuration conf = new Configuration();

    FileSystem fs = FileSystem.get( URI.create( uri ), conf );
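For example, reading a file over hftp works with the snippet above. A minimal self-contained sketch (the file path `/tmp/sample.txt` is hypothetical, just for illustration):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HftpReadSketch {
    public static void main(String[] args) throws Exception {
        // hftp talks to the NameNode's web UI port (50070 by default).
        String uri = "hftp://172.16.xxx.xxx:50070/";

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);

        // Read a (hypothetical) file line by line.
        BufferedReader in = new BufferedReader(
                new InputStreamReader(fs.open(new Path("/tmp/sample.txt"))));
        String line;
        while ((line = in.readLine()) != null) {
            System.out.println(line);
        }
        in.close();
        fs.close();
    }
}
```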

However, HFTP appears to be read-only, and I want to read and write as well
as copy files; that is, I want to connect over HDFS. How can I enable HDFS
connections so that I can edit the actual remote filesystem using the
FileSystem / Path APIs? Are there SSH settings that have to be configured
before I can do this?
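For reference, this is the kind of read/write access I am after. A sketch, assuming the NameNode's RPC port (commonly 8020 or 9000, as set by `fs.default.name` in core-site.xml) rather than the 50070 web UI port that hftp uses; the path `/tmp/hello.txt` is hypothetical:

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteSketch {
    public static void main(String[] args) throws Exception {
        // Assumed RPC address of the NameNode; NOT the 50070 HTTP port.
        String uri = "hdfs://172.16.xxx.xxx:8020/";

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);

        // Create a file on the remote filesystem and write to it.
        Path path = new Path("/tmp/hello.txt");
        FSDataOutputStream out = fs.create(path);
        out.writeUTF("hello hdfs");
        out.close();

        System.out.println("exists: " + fs.exists(path));
        fs.close();
    }
}
```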

I tried changing the protocol above from "hftp" to "hdfs", but I got the
following exception:

Exception in thread "main" java.io.IOException: Call to / failed on local exception: java.io.EOFException
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1139)
    at org.apache.hadoop.ipc.Client.call(Client.java:1107)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
    at $Proxy0.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:111)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:213)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:180)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1514)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1548)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1530)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)