Re: Counters across all jobs

Counters are per-job in Hadoop MapReduce. You need an external aggregator for such cross-job counters, e.g. a znode in ZooKeeper.

Also, is it just for display, or does your job logic depend on this? If it is the former, and if you don't mind waiting until the jobs finish, you can post-process the counters of all the jobs and compute the aggregates.
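A minimal sketch of that post-processing approach. In real code the per-job values would come from each completed job's `job.getCounters().findCounter(...).getValue()`; here they are hard-coded with the increments from this thread (job1 += 2, job3 += 5, job4 += 6), and the class and method names are illustrative:

```java
// Aggregates a counter across several finished jobs by computing
// running totals over the per-job increments.
public class CounterAggregation {

    // cumulative[i] = sum of increments[0..i]
    static long[] aggregate(long[] perJobIncrements) {
        long[] cumulative = new long[perJobIncrements.length];
        long total = 0;
        for (int i = 0; i < perJobIncrements.length; i++) {
            total += perJobIncrements[i];
            cumulative[i] = total;
        }
        return cumulative;
    }

    public static void main(String[] args) {
        // Increments of the hypothetical counter ACount for job1..job4
        long[] increments = {2, 0, 5, 6};
        long[] totals = aggregate(increments);
        for (int i = 0; i < totals.length; i++) {
            System.out.println("job" + (i + 1) + ":" + totals[i]);
        }
    }
}
```

Running this prints job1:2, job2:2, job3:7, job4:13 — the cumulative display asked for below.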

+Vinod Kumar Vavilapalli
Hortonworks Inc.

On Aug 28, 2012, at 1:20 AM, Kasi Subrahmanyam wrote:

> Hi,
> I have around 4 jobs running in a controller.
> How can I have a single unique counter shared by all the jobs and incremented wherever it is used in a job?
> For example, consider a counter ACount.
> If job1 increments the counter by 2, job3 by 5, and job4 by 6,
> can I have the counter's cumulative value displayed in the JobTracker as
> job1:2
> job2:2
> job3:7
> job4:13
> Thanks,
> Subbu