Flume, mail # dev - How I can use flume to automatically upload files into HDFS


kashif khan 2012-11-17, 18:33
Re: How I can use flume to automatically upload files into HDFS
Brock Noland 2012-11-17, 19:51
Hi,

This question is for the user@ list, not the dev@ list. Sounds like
you want the spool directory source, which will be available in the 1.3
release. Another gentleman has shared his configuration here:

http://s.apache.org/8Ea

for that source.
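For reference, the spooling-directory setup Brock describes can be sketched as a minimal agent configuration. The spool directory and namenode address come from the original question; the agent name (`agent1`), component names, channel settings, and the HDFS target directory (`/user/flume/data`) are assumptions for illustration:

```properties
# Minimal Flume agent (assumed name: agent1) wiring a spooldir source to an HDFS sink
agent1.sources = spool-src
agent1.channels = mem-ch
agent1.sinks = hdfs-sink

# Spooling directory source: picks up completed files dropped into the folder
agent1.sources.spool-src.type = spooldir
agent1.sources.spool-src.spoolDir = /usr/datastorage
agent1.sources.spool-src.channels = mem-ch

# In-memory channel (simple, but events are lost if the agent dies)
agent1.channels.mem-ch.type = memory
agent1.channels.mem-ch.capacity = 10000

# HDFS sink: namenode address from the question; /user/flume/data is hypothetical
agent1.sinks.hdfs-sink.type = hdfs
agent1.sinks.hdfs-sink.hdfs.path = hdfs://hadoop1.example.com:8020/user/flume/data
agent1.sinks.hdfs-sink.hdfs.fileType = DataStream
agent1.sinks.hdfs-sink.channel = mem-ch
```

An agent like this is typically started with `flume-ng agent --conf conf --conf-file agent1.conf --name agent1`. Note that the spooldir source expects files to be complete and immutable once placed in the directory; files still being written should be staged elsewhere and moved in when finished.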

Brock

On Sat, Nov 17, 2012 at 12:33 PM, kashif khan <[EMAIL PROTECTED]> wrote:
> HI,
>
> I am generating files continuously in a local folder on my base machine. How
> can I use Flume to stream the generated files from the local folder to
> HDFS?
> I don't know exactly how to configure the sources, sinks and HDFS.
>
> 1) location of folder where files are generating: /usr/datastorage/
> 2) name node address: hdfs://hadoop1.example.com:8020
>
> Please help me.
>
> Many thanks
>
> Best regards,
> KK

--
Apache MRUnit - Unit testing MapReduce - http://incubator.apache.org/mrunit/