Patai Sangbutsarakum 2013-02-28, 06:57
Harsh J 2013-02-28, 07:06
Hi Harsh and Patai,
I also have some performance-related questions building on Patai's. Could
anyone give me some hints?
1. When running TeraSort on a cluster, I found that the shuffle phase
takes almost half of the total reduce runtime. Does the copy from the
map output to the reducer account for almost all of the shuffle time?
2. Does each reducer get a contiguous part of the map output file when
there is more than one reducer? From the source code, ReduceTask.java
starts a number of copier threads (5 by default), each of which issues an
HTTP GET to the TaskTracker that ran the corresponding map task. In the
doGet method of TaskTracker.java, the TaskTracker reads the map output
file after looking up the partition's offset in the map output's index
file.
3. Has anyone done any performance analysis of the HTTP copy framework?
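To make question 2 concrete, here is a minimal, self-contained sketch of the index-file lookup described above, assuming the standard spill-index layout of three 8-byte longs per partition (start offset, raw length, compressed length). The class name and file names here are hypothetical, not Hadoop's actual code:

```java
import java.io.*;

// Sketch of how a TaskTracker could locate one reducer's slice of a map
// output: the spill index file stores, per partition, three 8-byte longs
// (start offset, raw length, compressed length).
public class IndexLookupSketch {
    static final int RECORD_SIZE = 3 * 8; // three longs per partition record

    // Returns {startOffset, rawLength, compressedLength} for a partition.
    static long[] readIndexRecord(File indexFile, int partition) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(indexFile, "r")) {
            raf.seek((long) partition * RECORD_SIZE);
            return new long[] { raf.readLong(), raf.readLong(), raf.readLong() };
        }
    }

    public static void main(String[] args) throws IOException {
        // Build a tiny synthetic index for 3 partitions to exercise the lookup.
        File idx = File.createTempFile("file.out", ".index");
        idx.deleteOnExit();
        try (DataOutputStream out = new DataOutputStream(new FileOutputStream(idx))) {
            long offset = 0;
            for (long len : new long[] { 100, 250, 75 }) {
                out.writeLong(offset); // start offset within the map output file
                out.writeLong(len);    // raw (uncompressed) length
                out.writeLong(len);    // compressed length (no codec here)
                offset += len;
            }
        }
        long[] rec = readIndexRecord(idx, 1);
        System.out.println(rec[0] + " " + rec[1]); // partition 1 starts at 100, length 250
    }
}
```

Because the index records are fixed-size, the server can seek straight to partition N and serve exactly that contiguous byte range of the map output, which is why each reducer does receive a contiguous slice of each map's output file.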
On Thu, Feb 28, 2013 at 3:06 PM, Harsh J <[EMAIL PROTECTED]> wrote:
> The latter (from other machines, inbound to where the reduce is
> running, onto the reduce's local disk, via mapred.local.dir). The
> reduce will, obviously, copy outputs from all maps that may have
> produced data for its assigned partition ID.
> On Thu, Feb 28, 2013 at 12:27 PM, Patai Sangbutsarakum
> <[EMAIL PROTECTED]> wrote:
> > Good evening Hadoopers!
> > at the jobtracker page, click on a job, and click at running reduce
> > task, I am going to see
> > task_201302271736_0638_r_000000 reduce > copy (136 of 261 at 0.44 MB/s)
> > I am really curious where the data is being copied.
> > If I click on the task, it shows the host that is running the task.
> > The question is: does "reduce > copy" refer to data being copied
> > outbound from the host that is running the task attempt, or to data
> > being copied from other machines inbound to this host (the one running
> > the task attempt)?
> > And in either case, how do I know which machines that host is copying data from?
> > Regards,
> > Patai
> Harsh J
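Harsh's mention of the reduce's "assigned partition ID" can be illustrated with the logic of the default HashPartitioner: every map task routes each record to a partition number, and reducer N later fetches partition N from every map output. A minimal sketch (the class name and sample keys are made up for illustration):

```java
// Sketch of the default HashPartitioner logic: each reducer owns one
// partition ID, and every map output file contains one slice per partition.
public class PartitionSketch {
    // Mirrors HashPartitioner: mask off the sign bit, then take the modulo.
    static int getPartition(Object key, int numReduceTasks) {
        return (key.hashCode() & Integer.MAX_VALUE) % numReduceTasks;
    }

    public static void main(String[] args) {
        int reducers = 4;
        for (String key : new String[] { "alpha", "beta", "gamma" }) {
            // Every map task computes the same partition for the same key,
            // so one reducer sees all values for that key, from all maps.
            System.out.println(key + " -> partition " + getPartition(key, reducers));
        }
    }
}
```

This is why a reduce's copy phase pulls from every node that ran a map: each of those maps may have produced records hashing to the reduce's partition.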
Patai Sangbutsarakum 2013-02-28, 07:23
Harsh J 2013-03-20, 07:43