Flume >> mail # user >> Automatically upload files into HDFS


+
kashif khan 2012-11-19, 10:44
+
Mohammad Tariq 2012-11-19, 10:50
+
kashif khan 2012-11-19, 12:30
+
Mohammad Tariq 2012-11-19, 12:34
+
Mohammad Tariq 2012-11-19, 12:35
+
Alexander Alten-Lorenz 2012-11-19, 12:26
+
kashif khan 2012-11-19, 12:35
+
Mohammad Tariq 2012-11-19, 12:41
+
kashif khan 2012-11-19, 12:53
+
Mohammad Tariq 2012-11-19, 13:18
+
kashif khan 2012-11-19, 14:01
Re: Automatically upload files into HDFS
Is your cluster running fine? Are you able to browse HDFS through the NameNode
web console on port 50070?

Regards,
    Mohammad Tariq
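
For anyone reading along later: both of those questions can also be answered
from code rather than the browser. A minimal sketch, not part of the original
exchange, assuming the same /etc/hadoop/conf files used elsewhere in this
thread:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ClusterCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
        FileSystem fs = FileSystem.get(conf);
        // An hdfs:// URI here means the client found the cluster;
        // file:/// means the configuration was not picked up.
        System.out.println("Filesystem in use: " + fs.getUri());
        // Listing the root is roughly what the web console on port 50070 shows.
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}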

On Mon, Nov 19, 2012 at 7:31 PM, kashif khan <[EMAIL PROTECTED]> wrote:

> Many thanks.
>
> I have changed the program accordingly. It now shows no errors, only one
> warning, but when I browse the HDFS folder the file has not been copied.
>
>
> import java.io.IOException;
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
>
> public class CopyData {
>     public static void main(String[] args) throws IOException {
>         Configuration conf = new Configuration();
>         // Load the cluster configuration so FileSystem.get() returns HDFS
>         // rather than the local file system.
>         conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
>         conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
>         FileSystem fs = FileSystem.get(conf);
>         Path inputFile = new Path("/usr/Eclipse/Output.csv");   // local source
>         Path outputFile = new Path("/user/hduser/Output1.csv"); // HDFS destination
>         fs.copyFromLocalFile(inputFile, outputFile);
>         fs.close();
>     }
> }
>
> 19-Nov-2012 13:50:32 org.apache.hadoop.util.NativeCodeLoader <clinit>
> WARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>
> Do you have any idea?
>
> Many thanks
>
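
The NativeCodeLoader warning above is harmless on its own: Hadoop simply falls
back to its pure-Java implementations. When a copy "succeeds" but nothing shows
up in HDFS, the usual cause is that the XML resources were not found, so
FileSystem.get() silently returned the local file system and the file landed on
the local disk instead. One hedged way to check, right after the
copyFromLocalFile() call in the CopyData class above (not part of the original
message):

        fs.copyFromLocalFile(inputFile, outputFile);
        // file:/// here means the configuration was not picked up and the
        // copy went to the local file system, not to HDFS.
        System.out.println("Filesystem: " + fs.getUri());
        System.out.println("File visible: " + fs.exists(outputFile));
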
> On Mon, Nov 19, 2012 at 1:18 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>
>> If it is just copying the files without any processing or change, you can
>> use something like this:
>>
>> import java.io.IOException;
>>
>> import org.apache.hadoop.conf.Configuration;
>> import org.apache.hadoop.fs.FileSystem;
>> import org.apache.hadoop.fs.Path;
>>
>> public class CopyData {
>>     public static void main(String[] args) throws IOException {
>>         Configuration configuration = new Configuration();
>>         // Point the client at this cluster's configuration files.
>>         configuration.addResource(new Path("/home/mohammad/hadoop-0.20.205/conf/core-site.xml"));
>>         configuration.addResource(new Path("/home/mohammad/hadoop-0.20.205/conf/hdfs-site.xml"));
>>         FileSystem fs = FileSystem.get(configuration);
>>         Path inputFile = new Path("/home/mohammad/pc/work/FFT.java"); // local source
>>         Path outputFile = new Path("/mapout/FFT.java");               // HDFS destination
>>         fs.copyFromLocalFile(inputFile, outputFile);
>>         fs.close();
>>     }
>> }
>>
>> Obviously you have to modify it to suit your requirements, for example by
>> continuously polling the target directory for new files (a rough sketch of
>> such a polling loop follows this message).
>>
>> Regards,
>>     Mohammad Tariq
>>
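
For the "continuously polling" part, a rough sketch of what such a loop could
look like. The watch directory, HDFS target, and ten-second interval are
invented for illustration, and a real version would also need to make sure a
file is fully written before uploading it:

import java.io.File;
import java.io.IOException;
import java.util.HashSet;
import java.util.Set;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DirectoryPoller {
    public static void main(String[] args) throws IOException, InterruptedException {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
        FileSystem fs = FileSystem.get(conf);

        File watchDir = new File("/usr/Eclipse/incoming"); // hypothetical local directory
        Path hdfsDir = new Path("/user/hduser/input");     // hypothetical HDFS target
        Set<String> seen = new HashSet<String>();

        while (true) {
            File[] files = watchDir.listFiles();
            if (files != null) {
                for (File f : files) {
                    // Upload each file we have not seen before.
                    if (f.isFile() && seen.add(f.getName())) {
                        fs.copyFromLocalFile(new Path(f.getAbsolutePath()),
                                             new Path(hdfsDir, f.getName()));
                        System.out.println("Uploaded " + f.getName());
                    }
                }
            }
            Thread.sleep(10000); // poll every 10 seconds
        }
    }
}

Worth noting that Flume itself, the topic of this list, gained a
spooling-directory source around the 1.3 release that covers the same use case
without a hand-rolled poller.
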
>> On Mon, Nov 19, 2012 at 6:23 PM, kashif khan <[EMAIL PROTECTED]> wrote:
>>
>>> Thanks M Tariq
>>>
>>> As I am new to Java and Hadoop and do not have much experience, I am trying
>>> to first write a simple program to upload data into HDFS and gradually move
>>> forward. I have written the following simple program to upload a file into
>>> HDFS, but I don't know why it is not working. Could you please check it if
>>> you have time?
>>>
>>> import java.io.BufferedInputStream;
>>> import java.io.BufferedOutputStream;
>>> import java.io.File;
>>> import java.io.FileInputStream;
>>> import java.io.FileOutputStream;
>>> import java.io.IOException;
>>> import java.io.InputStream;
>>> import java.io.OutputStream;
>>>
>>> import org.apache.hadoop.conf.Configuration;
>>> import org.apache.hadoop.fs.FSDataInputStream;
>>> import org.apache.hadoop.fs.FSDataOutputStream;
>>> import org.apache.hadoop.fs.FileSystem;
>>> import org.apache.hadoop.fs.Path;
>>> public class hdfsdata {
>>>
>>>
>>> public static void main(String [] args) throws IOException
>>> {
>>>     try{
>>>
>>>
>>>     Configuration conf = new Configuration();
>>>     conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
>>>     conf.addResource(new Path ("/etc/hadoop/conf/hdfs-site.xml"));
>>>     FileSystem fileSystem = FileSystem.get(conf);
>>>     String source = "/usr/Eclipse/Output.csv";
>>>     String dest = "/user/hduser/input/";
>>>
>>>     //String fileName = source.substring(source.lastIndexOf('/') +
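
The archive cuts the rest of that message off mid-line. Judging from the
imports, the code is building toward a manual stream copy into HDFS; purely as
a hedged sketch (the buffer size, the missing try/catch, and the variable
handling are guesses, not the original author's code), such a copy is
typically finished like this:

import java.io.BufferedInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class StreamCopy {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
        FileSystem fileSystem = FileSystem.get(conf);

        String source = "/usr/Eclipse/Output.csv";
        String dest = "/user/hduser/input/";

        // Append the source file name to the destination directory.
        String fileName = source.substring(source.lastIndexOf('/') + 1);
        Path outPath = new Path(dest + fileName);

        // Stream the local file into HDFS in fixed-size chunks.
        InputStream in = new BufferedInputStream(new FileInputStream(new File(source)));
        FSDataOutputStream out = fileSystem.create(outPath);
        byte[] buffer = new byte[4096];
        int bytesRead;
        while ((bytesRead = in.read(buffer)) > 0) {
            out.write(buffer, 0, bytesRead);
        }
        in.close();
        out.close();
        fileSystem.close();
    }
}

For a plain file upload, copyFromLocalFile() does all of this in one call, as
in the earlier messages; the stream version only matters when the bytes need
processing on the way in.
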
kashif khan 2012-11-19, 14:29
kashif khan 2012-11-19, 14:43
Mohammad Tariq 2012-11-19, 14:53
kashif khan 2012-11-19, 15:01
Mohammad Tariq 2012-11-19, 15:10
kashif khan 2012-11-19, 15:34
Mohammad Tariq 2012-11-19, 15:41
kashif khan 2012-11-20, 10:40
Mohammad Tariq 2012-11-20, 14:19
kashif khan 2012-11-20, 14:27
Mohammad Tariq 2012-11-20, 14:33
kashif khan 2012-11-20, 14:36
Mohammad Tariq 2012-11-20, 14:53
kashif khan 2012-11-20, 15:04
kashif khan 2012-11-20, 16:22
shekhar sharma 2012-11-20, 19:06
kashif khan 2012-11-21, 12:36
shekhar sharma 2012-11-26, 16:42
kashif khan 2012-11-27, 21:25