Flume >> mail # user >> Automatically upload files into HDFS


kashif khan 2012-11-19, 10:44
Mohammad Tariq 2012-11-19, 10:50
kashif khan 2012-11-19, 12:30
Mohammad Tariq 2012-11-19, 12:34
Mohammad Tariq 2012-11-19, 12:35
Alexander Alten-Lorenz 2012-11-19, 12:26
kashif khan 2012-11-19, 12:35
Mohammad Tariq 2012-11-19, 12:41
kashif khan 2012-11-19, 12:53
Mohammad Tariq 2012-11-19, 13:18
kashif khan 2012-11-19, 14:01
Mohammad Tariq 2012-11-19, 14:04
kashif khan 2012-11-19, 14:29
kashif khan 2012-11-19, 14:43
Mohammad Tariq 2012-11-19, 14:53
kashif khan 2012-11-19, 15:01
Mohammad Tariq 2012-11-19, 15:10
kashif khan 2012-11-19, 15:34
Mohammad Tariq 2012-11-19, 15:41
kashif khan 2012-11-20, 10:40
Re: Automatically upload files into HDFS
Hello Kashif,

     You are correct. This is because of a version mismatch. I am not using
CDH personally, but AFAIK CDH4 uses Hadoop 2.x.

Regards,
    Mohammad Tariq
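One thing worth checking in the program quoted below: `fs.default.name` is set to `hadoop1.example.com:8020` with no `hdfs://` scheme, so the value does not parse as an HDFS URI at all. This plain-JDK sketch (no Hadoop jars needed; the hostname is the one from the thread) shows the difference:

```java
import java.net.URI;
import java.net.URISyntaxException;

public class SchemeCheck {
    public static void main(String[] args) throws URISyntaxException {
        // Without a scheme prefix, "hadoop1.example.com" itself is parsed
        // as the scheme and no host is recognized.
        URI bare = new URI("hadoop1.example.com:8020");
        System.out.println(bare.getScheme()); // hadoop1.example.com
        System.out.println(bare.getHost());   // null

        // Fully qualified: scheme, host, and port all parse as intended.
        URI qualified = new URI("hdfs://hadoop1.example.com:8020");
        System.out.println(qualified.getScheme()); // hdfs
        System.out.println(qualified.getHost());   // hadoop1.example.com
        System.out.println(qualified.getPort());   // 8020
    }
}
```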

On Tue, Nov 20, 2012 at 4:10 PM, kashif khan <[EMAIL PROTECTED]> wrote:

> HI M Tariq
>
>
> I am trying the following the program to create directory and copy file to
> hdfs. But I am getting the following errors
>
>
>
> Program:
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
> import java.io.IOException;
>
> public class CopyFile {
>
>         public static void main(String[] args) throws IOException {
>                 Configuration conf = new Configuration();
>                 conf.set("fs.default.name", "hadoop1.example.com:8020");
>                 FileSystem dfs = FileSystem.get(conf);
>                 String dirName = "Test1";
>                 Path dir = new Path(dfs.getWorkingDirectory() + "/" + dirName);
>                 dfs.mkdirs(dir);
>                 // Copy the local file, not the HDFS directory itself
>                 Path src = new Path("/usr/Eclipse/Output.csv");
>                 Path dst = new Path(dfs.getWorkingDirectory() + "/Test1/");
>                 dfs.copyFromLocalFile(src, dst);
>         }
> }
>
>
>     Exception in thread "main" org.apache.hadoop.ipc.RemoteException:
> Server IPC version 7 cannot communicate with client version 4
>     at org.apache.hadoop.ipc.Client.call(Client.java:1070)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>     at $Proxy1.getProtocolVersion(Unknown Source)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>     at
> org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>     at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>     at
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>     at CopyFile.main(CopyFile.java:11)
>
>
>
> I am using CDH4.1. I have downloaded the hadoop-1.0.4 source and
> imported the jar files into Eclipse. I think it is due to a version problem.
> Could you please let me know what the correct version for CDH4.1 is?
>
> Many thanks
>
>
>
>
>
>
> On Mon, Nov 19, 2012 at 3:41 PM, Mohammad Tariq <[EMAIL PROTECTED]>wrote:
>
>> It should work. The same code is working fine for me. Try creating some
>> other directory in your HDFS and use it as your output path. Also see if
>> you find anything in the datanode logs.
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>>
>> On Mon, Nov 19, 2012 at 9:04 PM, kashif khan <[EMAIL PROTECTED]>wrote:
>>
>>> The input path is fine. The problem is in the output path. I am just
>>> wondering why it copies the data onto the local disk (/user/root/), not
>>> into HDFS. Are we giving the correct statement to point to HDFS?
>>>
>>> Thanks
>>>
>>>
>>>
>>> On Mon, Nov 19, 2012 at 3:10 PM, Mohammad Tariq <[EMAIL PROTECTED]>wrote:
>>>
>>>> Try this as your input file path
>>>> Path inputFile = new Path("file:///usr/Eclipse/Output.csv");
>>>>
>>>> Regards,
>>>>     Mohammad Tariq
>>>>
>>>>
>>>>
>>>> On Mon, Nov 19, 2012 at 8:31 PM, kashif khan <[EMAIL PROTECTED]>wrote:
>>>>
>>>>> When I apply the command
>>>>>
>>>>> $ hadoop fs -put /usr/Eclipse/Output.csv /user/root/Output.csv
>>>>>
>>>>> it works fine and the file is browsable in HDFS. But I don't know why
>>>>> it does not work in the program.
>>>>>
>>>>> Many thanks for your cooperation.
>>>>>
>>>>> Best regards,
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Mon, Nov 19, 2012 at 2:53 PM, Mohammad Tariq <[EMAIL PROTECTED]>wrote:
>>>>>
>>>>>> It would be good if I could have a look at the files. Meantime try
kashif khan 2012-11-20, 14:27
Mohammad Tariq 2012-11-20, 14:33
kashif khan 2012-11-20, 14:36
Mohammad Tariq 2012-11-20, 14:53
kashif khan 2012-11-20, 15:04
kashif khan 2012-11-20, 16:22
shekhar sharma 2012-11-20, 19:06
kashif khan 2012-11-21, 12:36
shekhar sharma 2012-11-26, 16:42
kashif khan 2012-11-27, 21:25
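For readers following along without a cluster, the mkdir-then-copy operation the thread keeps circling around can be mimicked locally with plain JDK NIO (the paths here are hypothetical; Hadoop's `mkdirs` and `copyFromLocalFile` perform the analogous steps against the remote filesystem):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class LocalCopySketch {
    // Mimics "dfs.mkdirs(...) + dfs.copyFromLocalFile(...)": create the
    // target directory, then copy the source file into it under its own name.
    static Path copyInto(Path source, Path targetDir) throws IOException {
        Files.createDirectories(targetDir); // like dfs.mkdirs(dir)
        Path dst = targetDir.resolve(source.getFileName());
        return Files.copy(source, dst, StandardCopyOption.REPLACE_EXISTING);
    }

    public static void main(String[] args) throws IOException {
        Path src = Files.createTempFile("Output", ".csv");
        Files.writeString(src, "a,b,c\n");
        Path dir = Files.createTempDirectory("Test1");
        Path copied = copyInto(src, dir);
        System.out.println(Files.readString(copied)); // a,b,c
    }
}
```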