

Re: How can I use Flume to automatically upload files into HDFS
Hi,

This question is for the user@ list, not the dev@ list. It sounds like
you want the spooling directory source, which will be available in the
1.3 release. Another gentleman has shared his configuration for that
source here:

http://s.apache.org/8Ea
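
In case it helps, here is a minimal sketch of what an agent configuration
for this could look like, using the paths from your mail. This is only an
illustration, not the configuration from the linked post: the agent,
source, channel, and sink names, the HDFS target directory, and the
memory-channel settings are all assumptions, and it presumes Flume 1.3+
for the spooldir source type.

# Names agent1 / spool-src / mem-ch / hdfs-sink are illustrative only
agent1.sources = spool-src
agent1.channels = mem-ch
agent1.sinks = hdfs-sink

# Spooling directory source: picks up completed files dropped into the local folder
agent1.sources.spool-src.type = spooldir
agent1.sources.spool-src.spoolDir = /usr/datastorage
agent1.sources.spool-src.channels = mem-ch

# Memory channel buffering events between the source and the sink
agent1.channels.mem-ch.type = memory
agent1.channels.mem-ch.capacity = 10000

# HDFS sink writing to the cluster; the target directory is an assumed example
agent1.sinks.hdfs-sink.type = hdfs
agent1.sinks.hdfs-sink.channel = mem-ch
agent1.sinks.hdfs-sink.hdfs.path = hdfs://hadoop1.example.com:8020/user/flume/data
agent1.sinks.hdfs-sink.hdfs.fileType = DataStream

You would then start the agent with something along these lines (the
config file name is assumed):

flume-ng agent --name agent1 --conf conf --conf-file spool-to-hdfs.properties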

Brock

On Sat, Nov 17, 2012 at 12:33 PM, kashif khan <[EMAIL PROTECTED]> wrote:
> Hi,
>
> I am continuously generating files in a local folder on my base machine.
> How can I use Flume to stream the generated files from the local folder
> to HDFS? I don't know exactly how to configure the sources, sinks, and
> HDFS.
>
> 1) Location of the folder where files are being generated: /usr/datastorage/
> 2) Name node address: hdfs://hadoop1.example.com:8020
>
> Please help me.
>
> Many thanks
>
> Best regards,
> KK

--
Apache MRUnit - Unit testing MapReduce - http://incubator.apache.org/mrunit/