----- Original Message ----
From: Amandeep Khurana <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED]
Sent: Wednesday, March 11, 2009 9:46:09 PM
Subject: Re: How to read output files over HDFS
2 ways that I can think of:
1. Write another MR job with no reducer (a map-only job). The mapper can be
made to do whatever logic you need.
2. Get an instance of the DistributedFileSystem class in your Java code and
use it to read the files from HDFS (see the sketch below).
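
For option 1, the key is just to set the number of reduce tasks to 0 on the
JobConf (conf.setNumReduceTasks(0)), so the mapper output goes straight to the
job's output directory.

For option 2, here is a minimal sketch, assuming the output files are plain
text and that the default Configuration on your classpath points at the
cluster; the directory name output0 and the class name ReadHdfsOutput are just
placeholders based on your question:

import java.io.BufferedReader;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReadHdfsOutput {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // With fs.default.name pointing at hdfs://..., this returns the
    // DistributedFileSystem implementation.
    FileSystem fs = FileSystem.get(conf);

    // Read every part-XXXXX file in one job's output directory, line by line.
    Path outputDir = new Path("output0");
    for (FileStatus status : fs.listStatus(outputDir)) {
      Path file = status.getPath();
      if (!file.getName().startsWith("part-")) {
        continue; // skip _logs and any other non-data entries
      }
      BufferedReader reader =
          new BufferedReader(new InputStreamReader(fs.open(file)));
      try {
        String line;
        while ((line = reader.readLine()) != null) {
          // process each output line here
          System.out.println(line);
        }
      } finally {
        reader.close();
      }
    }
  }
}

You can run the same loop over output1, output2, etc., or pass the directory
names in as command-line arguments.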
Computer Science Graduate Student
University of California, Santa Cruz
On Wed, Mar 11, 2009 at 9:23 PM, Muhammad Arshad <[EMAIL PROTECTED]> wrote:
> I am running multiple MapReduce jobs which generate their output in
> directories named output0, output1, output2, etc. Once these jobs
> complete, I want to read the output stored in these files (line by line)
> from Java code, automatically.
> Kindly tell me how I can do this.
> I do not want to use the 'hadoop dfs -get ... ...' command to first copy the
> output files to a local directory. I would be grateful if somebody could
> write me a snippet of code for doing this task.