HDFS >> mail # user >> Re: How to copy log files from remote windows machine to Hadoop cluster


Re: How to copy log files from remote windows machine to Hadoop cluster
I have studied Flume, but I didn't find anything that fits my case.
My requirement: there is a directory on a Windows machine in which log
files are generated and continually updated with new entries. I want a
tail-like mechanism (using the exec source) through which I can push the
latest updates into the cluster.
Alternatively, I could push to the cluster once a day using the spooling
directory mechanism.

Can somebody advise whether this is possible with Flume, and if so, what
configuration is needed, specifically for a remote Windows machine?
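For the daily-push case described above, a minimal Flume agent sketch might look like the following. This is an illustrative configuration, not a tested setup: the agent name, directory paths, and HDFS URL are hypothetical, and it assumes the Windows log files are rotated or copied into a staging directory the agent can read (the spooling directory source requires files to be immutable once they appear there).

```properties
# Hypothetical agent "a1": spooling-directory source -> memory channel -> HDFS sink.
a1.sources = src1
a1.channels = ch1
a1.sinks = sink1

# Spooling directory source: picks up completed log files dropped here.
# Files must not be modified after being placed in this directory.
a1.sources.src1.type = spooldir
a1.sources.src1.spoolDir = /var/spool/windows-logs
a1.sources.src1.channels = ch1

# In-memory channel (swap for a file channel if durability matters).
a1.channels.ch1.type = memory
a1.channels.ch1.capacity = 10000

# HDFS sink: writes events into the cluster, bucketed by date.
# useLocalTimeStamp lets the %Y-%m-%d escapes work without a timestamp header.
a1.sinks.sink1.type = hdfs
a1.sinks.sink1.hdfs.path = hdfs://namenode:8020/logs/%Y-%m-%d
a1.sinks.sink1.hdfs.fileType = DataStream
a1.sinks.sink1.hdfs.useLocalTimeStamp = true
a1.sinks.sink1.channel = ch1
```

Such an agent would typically be started with something like `flume-ng agent --name a1 --conf-file agent.conf`. For the tail-like case, note that the exec source runs a shell command (e.g. `tail -F`), which generally assumes a Unix-like environment; on Windows this is commonly worked around by shipping rotated files to a staging directory instead.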


On Thu, Jan 17, 2013 at 3:48 PM, Mirko Kämpf <[EMAIL PROTECTED]> wrote:

> Give Flume (http://flume.apache.org/) a chance to collect your data.
>
> Mirko
>
>
>
> 2013/1/17 sirenfei <[EMAIL PROTECTED]>
>
>> ftp auto upload?
>>
>>
>> 2013/1/17 Mahesh Balija <[EMAIL PROTECTED]>:
>> > the Hadoop cluster (HDFS) either in synchronous or asynchronous
>>
>
>