Re: How to copy log files from remote windows machine to Hadoop cluster
Yes, it is possible. I haven't tried the Windows + Flume + Hadoop combo
personally, but it should work. You may find this helpful; it
explains beautifully how to run Flume on a Windows box. If I
get time I'll try to simulate your use case and let you know.
BTW, could you please share with us whatever you have tried?
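In the meantime, here is a rough sketch of what such an agent config could look like. All names (agent, source, channel, sink, paths, the NameNode host) are illustrative assumptions, not taken from your setup; also note Windows has no native `tail`, so the exec source assumes a tail port (e.g. from Cygwin or GnuWin32) is on the PATH:

```properties
# Hypothetical Flume agent: tail a Windows log file into HDFS.
agent.sources = tailsrc
agent.channels = memch
agent.sinks = hdfssink

# Exec source: follow the log file for new lines (assumes a `tail` binary exists).
agent.sources.tailsrc.type = exec
agent.sources.tailsrc.command = tail -F C:/logs/app.log
agent.sources.tailsrc.channels = memch

# Alternative for a once-a-day push: a spooling directory source instead.
# agent.sources.tailsrc.type = spooldir
# agent.sources.tailsrc.spoolDir = C:/logs/outbox

# Memory channel buffering events between source and sink.
agent.channels.memch.type = memory
agent.channels.memch.capacity = 10000

# HDFS sink writing to the cluster, bucketed by date.
agent.sinks.hdfssink.type = hdfs
agent.sinks.hdfssink.channel = memch
agent.sinks.hdfssink.hdfs.path = hdfs://namenode:8020/flume/logs/%Y-%m-%d
agent.sinks.hdfssink.hdfs.fileType = DataStream
agent.sinks.hdfssink.hdfs.useLocalTimeStamp = true
```

You would start the agent on the Windows box with something like `flume-ng agent -n agent -f agent.conf`, and it will push events to HDFS over the network, so only the Flume client side needs to run on Windows.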
On Thu, Jan 17, 2013 at 4:09 PM, Mahesh Balija wrote:
> I have studied Flume but I didn't find anything useful for my case.
> My requirement is this: there is a directory on a Windows machine in which
> files are generated and kept updated with new logs. I want a
> tail-like mechanism (using the exec source) through which I can push the
> latest updates to the cluster.
> Alternatively, I could simply push to the cluster once a day using the
> spooling directory mechanism.
> Can somebody confirm whether this is possible using Flume, and if so, the
> configuration needed, specific to a remote Windows machine?
> On Thu, Jan 17, 2013 at 3:48 PM, Mirko Kämpf <[EMAIL PROTECTED]> wrote:
>> Give Flume (http://flume.apache.org/) a chance to collect your data.
>> 2013/1/17 sirenfei <[EMAIL PROTECTED]>
>>> ftp auto upload?
>>> 2013/1/17 Mahesh Balija <[EMAIL PROTECTED]>:
>>> > the Hadoop cluster (HDFS) either in synchronous or asynchronous