Hadoop, mail # user - How to read output files over HDFS


Re: How to read output files over HDFS
lohit 2009-03-12, 05:48

http://wiki.apache.org/hadoop/HadoopDfsReadWriteExample

Lohit

----- Original Message ----
From: Amandeep Khurana <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED]
Sent: Wednesday, March 11, 2009 9:46:09 PM
Subject: Re: How to read output files over HDFS

Two ways that I can think of:
1. Write another MR job without a reducer. The mapper can be made to do
whatever logic you want.
OR
2. Take an instance of the DistributedFileSystem class in your Java code and
use it to read the files from HDFS (see the sketch below).
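For option 2, here is a minimal sketch of reading every part file in one output directory line by line through the FileSystem API (FileSystem.get returns a DistributedFileSystem when the default filesystem is HDFS). The class name ReadHdfsOutput and the command-line argument handling are illustrative assumptions, not part of the original thread; the wiki page linked above covers writing as well.

import java.io.BufferedReader;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReadHdfsOutput {
    public static void main(String[] args) throws Exception {
        // Picks up the cluster settings from the Hadoop config files on the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // e.g. "output0" -- each job's output directory holds one part-NNNNN file per task.
        Path outputDir = new Path(args[0]);
        for (FileStatus status : fs.listStatus(outputDir)) {
            if (status.isDir()) {
                continue; // skip subdirectories such as _logs
            }
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(fs.open(status.getPath())));
            try {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line); // replace with your own per-line logic
                }
            } finally {
                reader.close();
            }
        }
    }
}

To cover directories output0, output1, output2, ... you would simply loop over the directory names and pass each one as the Path above.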
Amandeep Khurana
Computer Science Graduate Student
University of California, Santa Cruz
On Wed, Mar 11, 2009 at 9:23 PM, Muhammad Arshad <[EMAIL PROTECTED]> wrote:

> Hi,
>
> I am running multiple MapReduce jobs which generate their output in
> directories named output0, output1, output2, ...etc. Once these jobs
> complete, I want to read the output stored in these files (line by line)
> from Java code automatically.
>
> Kindly tell me how I can do this.
>
> I do not want to use the 'hadoop dfs -get ... ...' command to first bring
> the output files to a local directory. I would be grateful if somebody
> could write me a snippet of code for doing this task.
>
> thanks,
> --umer