Flume >> mail # user >> Automatically upload files into HDFS


Re: Automatically upload files into HDFS
Well, I want to upload the files automatically: they are generated
about every 3-5 seconds, and each file is about 3 MB.

 Is it possible to automate this with the put or cp commands?
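
Automating `hdfs dfs -put` is certainly possible; a minimal sketch (not from
this thread) is a polling script run from cron or a `while sleep` loop, which
uploads each completed file and then moves it aside so it is not re-uploaded.
The `uploaded` subdirectory and the HDFS destination path are placeholders.

```shell
#!/bin/sh
# Hedged sketch: push every regular file in a local spool directory to HDFS
# with `hdfs dfs -put`, then move it into an "uploaded" subdirectory so the
# next pass skips it. /usr/datastorage is the folder named in the thread;
# the HDFS path in the example call below is an assumption.

upload_spool_dir() {
    spool_dir="$1"     # local directory where files keep arriving
    hdfs_dir="$2"      # HDFS destination directory
    done_dir="$spool_dir/uploaded"
    mkdir -p "$done_dir"
    for f in "$spool_dir"/*; do
        [ -f "$f" ] || continue
        # only move the file aside if the upload succeeded
        if hdfs dfs -put "$f" "$hdfs_dir/"; then
            mv "$f" "$done_dir/"
        fi
    done
}

# example invocation (HDFS target path is assumed):
# upload_spool_dir /usr/datastorage /user/kk/datastorage
```

One caveat with this approach: a file must be fully written before the script
picks it up, or a partial copy lands in HDFS. Writing to a temporary name and
renaming into the spool directory when complete avoids that race.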

I read about Flume and WebHDFS, but I am not sure whether they will work.
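
For the Flume route, a minimal agent sketch would pair a spooling-directory
source with an HDFS sink. The agent, channel, and sink names below are
placeholders, and the target path under the namenode is an assumption; the
local folder and namenode address come from the message quoted below.

```properties
# Hedged sketch of a Flume agent: spooldir source -> memory channel -> HDFS sink
agent1.sources = src1
agent1.channels = ch1
agent1.sinks = sink1

# Spooling Directory source: ingests completed files dropped into this folder
agent1.sources.src1.type = spooldir
agent1.sources.src1.spoolDir = /usr/datastorage
agent1.sources.src1.channels = ch1

agent1.channels.ch1.type = memory
agent1.channels.ch1.capacity = 10000

# HDFS sink: writes events under the given path on the namenode
# (the /user/kk/datastorage part is assumed, not from the thread)
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.hdfs.path = hdfs://hadoop1.example.com:8020/user/kk/datastorage
agent1.sinks.sink1.hdfs.fileType = DataStream
agent1.sinks.sink1.channel = ch1
```

Note that the spooling-directory source expects files to be complete and
immutable once they appear in the spool directory.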

Many thanks

Best regards

On Mon, Nov 19, 2012 at 12:26 PM, Alexander Alten-Lorenz <
[EMAIL PROTECTED]> wrote:

> Hi,
>
> Why don't you use HDFS-related tools like put or cp?
>
> - Alex
>
> On Nov 19, 2012, at 11:44 AM, kashif khan <[EMAIL PROTECTED]> wrote:
>
> > HI,
> >
> > I am generating files continuously in a local folder on my base machine.
> > How can I now use Flume to stream the generated files from the local
> > folder to HDFS? I don't know exactly how to configure the sources, sinks
> > and HDFS.
> >
> > 1) location of folder where files are generated: /usr/datastorage/
> > 2) name node address: hdfs://hadoop1.example.com:8020
> >
> > Please help me.
> >
> > Many thanks
> >
> > Best regards,
> > KK
>
> --
> Alexander Alten-Lorenz
> http://mapredit.blogspot.com
> German Hadoop LinkedIn Group: http://goo.gl/N8pCF
>
>