Re: Trying to copy file to Hadoop file system from a program


Greetings,

Below is the program I am trying to run and the exception it produces:
***************************************

Test Start.....
java.net.UnknownHostException: unknown host: master
    at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:214)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1196)
    at org.apache.hadoop.ipc.Client.call(Client.java:1050)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
    at $Proxy1.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
    at kelly.hadoop.hive.test.HadoopTest.main(HadoopTest.java:54)

********************
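(For reference: the UnknownHostException above is thrown while the IPC client looks up the hostname "master". A minimal standalone check that the JVM can resolve that name at all -- this class is only a sketch, not part of my program -- would be:)

import java.net.InetAddress;
import java.net.UnknownHostException;

public class ResolveCheck {
    public static void main(String[] args) {
        try {
            // The same lookup the Hadoop IPC client performs before opening its socket.
            InetAddress addr = InetAddress.getByName("master");
            System.out.println("master resolves to " + addr.getHostAddress());
        } catch (UnknownHostException e) {
            System.out.println("\"master\" does not resolve on this machine; "
                    + "it would need an /etc/hosts entry or the NameNode's real hostname/IP.");
        }
    }
}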
import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DistributedFileSystem;

public class HdpTest {
   
    public static String fsURI = "hdfs://master:9000";

   
    public static void copyFileToDFS(FileSystem fs, String srcFile,
            String dstFile) throws IOException {
        try {
            System.out.println("Initialize copy...");
            URI suri = new URI(srcFile);
            URI duri = new URI(fsURI + "/" + dstFile);
            Path dst = new Path(duri.toString());
            Path src = new Path(suri.toString());
            System.out.println("Start copy...");
            fs.copyFromLocalFile(src, dst);
            System.out.println("End copy...");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        try {
            System.out.println("Test Start.....");
            Configuration conf = new Configuration();
            DistributedFileSystem fs = new DistributedFileSystem();
            URI duri = new URI(fsURI);
            fs.initialize(duri, conf); // The exception above occurs here
            long start = 0, end = 0;
            start = System.nanoTime();
            //writing data from local to HDFS
            copyFileToDFS(fs, "/home/kosmos/Work/input/wordpair.txt",
                    "/input/raptor/trade1.txt");
            //Writing data from HDFS to Local
//             copyFileFromDFS(fs, "/input/raptor/trade1.txt", "/home/kosmos/Work/input/wordpair1.txt");
            end = System.nanoTime();
            System.out.println("Total Execution times: " + (end - start));
            fs.close();
        } catch (Throwable t) {
            t.printStackTrace();
        }
    }
}

******************************
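(For comparison -- and assuming the same fsURI -- the more usual way to obtain the HDFS handle is FileSystem.get(), which picks DistributedFileSystem from the URI scheme. It performs the same hostname lookup, so it would fail the same way until "master" resolves. Just a sketch, not what I actually ran:)

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class FsGetSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // FileSystem.get() selects the implementation from the "hdfs" scheme,
        // so there is no need to instantiate DistributedFileSystem directly.
        FileSystem fs = FileSystem.get(new URI("hdfs://master:9000"), conf);
        System.out.println("Connected to " + fs.getUri());
        fs.close();
    }
}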
When I try to access this URL in Firefox:

hdfs://master:9000

I get an error message saying Firefox does not know how to display it.

I can successfully access my admin page:

http://localhost:50070/dfshealth.jsp
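(As I understand it, Firefox cannot display hdfs://master:9000 because that address speaks the HDFS RPC protocol rather than HTTP, so the closest programmatic equivalent of "browsing" it is listing a directory once the FileSystem handle connects. Sketch only, along the lines of the hypothetical FsGetSketch above:)

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ListRootSketch {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new URI("hdfs://master:9000"), new Configuration());
        // Roughly what the NameNode web UI's file browser shows for "/".
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath() + "\t" + status.getLen());
        }
        fs.close();
    }
}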

Just wondering if anyone can give me any suggestions; your help would be really appreciated.
Thanks
Sai