Blargy 2010-06-29, 00:10
S. Venkatesh 2010-06-29, 04:34
Steve Loughran 2010-06-29, 09:31
Just use the Hadoop client tools. That is, install the Hadoop package and
configure it to point to your running cluster. You don't need to start any
Hadoop processes on the node with your logs. Just use the command line
(hadoop dfs -put) or (hadoop distcp) to move the files from each application
server directly into your HDFS cluster.
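For concreteness, a rough sketch of what that could look like from one application server. The paths and the NameNode address are made-up placeholders, and this uses the 2010-era "hadoop dfs" syntax; it assumes the client's core-site.xml already points at the cluster:

```shell
# Sketch only: /var/log/myapp, /logs/myapp, and namenode:9000 are
# hypothetical paths/hosts -- substitute your own.

# Copy one server's log files straight into HDFS with the client tools:
hadoop dfs -mkdir /logs/myapp
hadoop dfs -put /var/log/myapp/*.log /logs/myapp/

# Or use distcp, which runs the copy as a MapReduce job (better suited
# to large transfers between filesystems):
hadoop distcp file:///var/log/myapp hdfs://namenode:9000/logs/myapp
```

Note that with distcp a file:// source has to be readable from the nodes running the copy tasks, so for logs sitting on each application server the plain -put from that server is usually the simpler route.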
On Mon, Jun 28, 2010 at 5:10 PM, Blargy <[EMAIL PROTECTED]> wrote:
> I am currently looking into importing all of our application log files
> (from multiple host machines) into HDFS. Can someone point me in the right direction or
> walk me through the process of how I can accomplish this? Any good reading
> material on this subject? Videos?
> I hope I don't need to physically copy all of the log files to one target
> machine before importing.
> Sent from the Hadoop lucene-users mailing list archive at Nabble.com.