Flume, mail # user - Automatically upload files into HDFS


Re: Automatically upload files into HDFS
kashif khan 2012-11-19, 14:29
Yes, my cluster is running. When I browse http://hadoop1.example.com:50070/dfshealth.jsp I get the main page. Then I click on Browse the
file system and I get the following:

hbase
tmp
user

And when I click on user I get:

beeswax
huuser (I have created)
root (I have created)

Would you like to see my configuration files? I did not change anything;
everything is at the defaults. I have installed CDH4.1 and it is running on VMs.

Many thanks
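Since the goal of the thread is to upload files automatically, the "continuously polling the targeted directory" idea mentioned further down can be sketched in plain Java. This is only a sketch: the class name, watched directory, and poll interval are hypothetical, and the actual HDFS upload is left as a comment because it needs a running cluster and the Hadoop jars on the classpath.

```java
import java.io.File;
import java.util.HashSet;
import java.util.Set;

public class DirectoryPoller {

    // Names of files we have already reported, so each file is uploaded once.
    private final Set<String> seen = new HashSet<>();

    // One poll pass: return the names of files that are new since the last pass.
    public Set<String> pollOnce(File dir) {
        Set<String> fresh = new HashSet<>();
        File[] files = dir.listFiles();
        if (files == null) {
            return fresh;              // directory missing or unreadable
        }
        for (File f : files) {
            if (f.isFile() && seen.add(f.getName())) {
                fresh.add(f.getName());
                // Here a real uploader would call, with fs obtained as in the
                // CopyData examples below:
                // fs.copyFromLocalFile(new Path(f.getAbsolutePath()),
                //                      new Path("/user/hduser/" + f.getName()));
            }
        }
        return fresh;
    }

    public static void main(String[] args) throws InterruptedException {
        DirectoryPoller poller = new DirectoryPoller();
        File watched = new File("/usr/Eclipse");   // directory used in the thread
        // A real uploader would loop forever; three passes are enough for a demo.
        for (int pass = 0; pass < 3; pass++) {
            for (String name : poller.pollOnce(watched)) {
                System.out.println("new file: " + name);
            }
            Thread.sleep(1000);                    // hypothetical poll interval
        }
    }
}
```

Keeping the seen-set in memory means already-uploaded files are forgotten on restart; a production version would persist that state or move files out of the watched directory after upload.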
On Mon, Nov 19, 2012 at 2:04 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:

> Is your cluster running fine? Are you able to browse HDFS through the HDFS
> Web Console at 50070?
>
> Regards,
>     Mohammad Tariq
>
>
>
> On Mon, Nov 19, 2012 at 7:31 PM, kashif khan <[EMAIL PROTECTED]>wrote:
>
>> Many thanks.
>>
>> I have changed the program accordingly. It does not show any error, only
>> one warning, but when I browse the HDFS folder the file has not been copied.
>>
>>
>> public class CopyData {
>>     public static void main(String[] args) throws IOException {
>>         Configuration conf = new Configuration();
>>         conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
>>         conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
>>         FileSystem fs = FileSystem.get(conf);
>>         Path inputFile = new Path("/usr/Eclipse/Output.csv");
>>         Path outputFile = new Path("/user/hduser/Output1.csv");
>>         fs.copyFromLocalFile(inputFile, outputFile);
>>         fs.close();
>>     }
>> }
>>
>> 19-Nov-2012 13:50:32 org.apache.hadoop.util.NativeCodeLoader <clinit>
>> WARNING: Unable to load native-hadoop library for your platform... using
>> builtin-java classes where applicable
>>
>> Have any idea?
>>
>> Many thanks
>>
>>
>>
>>
>>
>>
>> On Mon, Nov 19, 2012 at 1:18 PM, Mohammad Tariq <[EMAIL PROTECTED]>wrote:
>>
>>> If it is just copying the files without any processing or change, you
>>> can use something like this:
>>>
>>> public class CopyData {
>>>
>>>     public static void main(String[] args) throws IOException{
>>>
>>>         Configuration configuration = new Configuration();
>>>         configuration.addResource(new
>>> Path("/home/mohammad/hadoop-0.20.205/conf/core-site.xml"));
>>>         configuration.addResource(new
>>> Path("/home/mohammad/hadoop-0.20.205/conf/hdfs-site.xml"));
>>>         FileSystem fs = FileSystem.get(configuration);
>>>         Path inputFile = new Path("/home/mohammad/pc/work/FFT.java");
>>>         Path outputFile = new Path("/mapout/FFT.java");
>>>         fs.copyFromLocalFile(inputFile, outputFile);
>>>         fs.close();
>>>     }
>>> }
>>>
>>> Obviously you have to modify it as per your requirements like
>>> continuously polling the targeted directory for new files.
>>>
>>> Regards,
>>>     Mohammad Tariq
>>>
>>>
>>>
>>> On Mon, Nov 19, 2012 at 6:23 PM, kashif khan <[EMAIL PROTECTED]>wrote:
>>>
>>>> Thanks M Tariq
>>>>
>>>> As I am new to Java and Hadoop and do not have much experience, I am
>>>> first trying to write a simple program to upload data into HDFS and
>>>> gradually move forward. I have written the following simple program to
>>>> upload a file into HDFS, but I don't know why it is not working. Could
>>>> you please check it, if you have time?
>>>>
>>>> import java.io.BufferedInputStream;
>>>> import java.io.BufferedOutputStream;
>>>> import java.io.File;
>>>> import java.io.FileInputStream;
>>>> import java.io.FileOutputStream;
>>>> import java.io.IOException;
>>>> import java.io.InputStream;
>>>> import java.io.OutputStream;
>>>> import java.nio.*;
>>>> //import java.nio.file.Path;
>>>>
>>>> import org.apache.hadoop.conf.Configuration;
>>>> import org.apache.hadoop.fs.FSDataInputStream;
>>>> import org.apache.hadoop.fs.FSDataOutputStream;
>>