Re: How can I use Flume to automatically upload files into HDFS
Brock Noland 2012-11-17, 19:51
This question is for the user@ list, not the dev@ list. It sounds like
you want the spooling directory source, which will be available in the 1.3
release. Another gentleman has shared his configuration for that source
here:
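For reference, a minimal sketch of what such a spooling-directory-to-HDFS configuration might look like is below. The agent, channel, and sink names, the capacity values, and the `/user/flume/data` target directory are illustrative assumptions; the spool directory and name node address are taken from the question below.

```properties
# Illustrative agent named "agent1"; rename to match your deployment.
agent1.sources = src1
agent1.channels = ch1
agent1.sinks = sink1

# Spooling directory source: picks up completed files dropped
# into the local folder (files must not be written in place).
agent1.sources.src1.type = spooldir
agent1.sources.src1.spoolDir = /usr/datastorage
agent1.sources.src1.channels = ch1

# In-memory channel (capacity values are illustrative).
agent1.channels.ch1.type = memory
agent1.channels.ch1.capacity = 10000
agent1.channels.ch1.transactionCapacity = 1000

# HDFS sink writing to the name node from the question;
# the /user/flume/data path is a hypothetical target directory.
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.channel = ch1
agent1.sinks.sink1.hdfs.path = hdfs://hadoop1.example.com:8020/user/flume/data
agent1.sinks.sink1.hdfs.fileType = DataStream
```

An agent using a file like this would typically be started with something along the lines of `flume-ng agent --conf conf --conf-file example.conf --name agent1`.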
On Sat, Nov 17, 2012 at 12:33 PM, kashif khan <[EMAIL PROTECTED]> wrote:
> I am generating files continuously in a local folder on my base machine.
> How can I now use Flume to stream the generated files from the local
> folder to HDFS? I don't know exactly how to configure the sources, sinks
> and HDFS.
> 1) location of the folder where files are generated: /usr/datastorage/
> 2) name node address: hdfs://hadoop1.example.com:8020
> Please help me.
> Many thanks
> Best regards,
Apache MRUnit - Unit testing MapReduce - http://incubator.apache.org/mrunit/