Re: Counters across all jobs
Vinod Kumar Vavilapalli 2012-09-10, 17:46
Counters are per-job in Hadoop MapReduce. For cross-job counters you need an external aggregator, e.g. a node (znode) in ZooKeeper.
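As a rough illustration of the ZooKeeper approach, the sketch below keeps a shared counter in an external versioned store and increments it with an optimistic retry loop, the same pattern ZooKeeper's getData/setData-with-version calls (or Curator's SharedCount recipe) support. The `VersionedStore` interface and `InMemoryStore` fake are illustrative stand-ins so the sketch runs without a ZooKeeper server; they are not a real ZooKeeper or Hadoop API.

```java
import java.util.concurrent.atomic.AtomicReference;

// Stand-in for a ZooKeeper znode: read returns {value, version};
// the write succeeds only if the version has not changed (CAS-style),
// mirroring ZooKeeper's versioned setData.
interface VersionedStore {
    long[] read();
    boolean writeIfVersion(long value, long version);
}

// In-memory fake so the sketch is runnable without a ZooKeeper server.
class InMemoryStore implements VersionedStore {
    private final AtomicReference<long[]> cell =
            new AtomicReference<>(new long[]{0L, 0L});
    public long[] read() { return cell.get().clone(); }
    public boolean writeIfVersion(long value, long version) {
        long[] cur = cell.get();
        return cur[1] == version
                && cell.compareAndSet(cur, new long[]{value, version + 1});
    }
}

// Each job increments the shared counter through this wrapper;
// on a version conflict it re-reads and retries.
class CrossJobCounter {
    private final VersionedStore store;
    CrossJobCounter(VersionedStore store) { this.store = store; }

    long incrementBy(long delta) {
        while (true) {
            long[] vv = store.read();          // vv[0] = value, vv[1] = version
            long updated = vv[0] + delta;
            if (store.writeIfVersion(updated, vv[1])) return updated;
        }
    }
}
```

With the example from the original question, job1 adding 2, job3 adding 5, and job4 adding 6 against the same store would leave the shared counter at 13.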
Also, is it just for display, or does your job logic depend on it? If the former, and if you don't mind waiting until all jobs finish, you can post-process the counters of all the jobs and compute the aggregates.
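A minimal sketch of that post-processing step: in a real driver, each per-job value would come from something like job.getCounters().findCounter(group, "ACount").getValue() after the job completes; here the values are supplied directly so the sketch runs without a cluster, and the group/counter names are only illustrative.

```java
import java.util.Arrays;
import java.util.List;

class CounterAggregator {
    // Sum one counter's value across all finished jobs.
    static long aggregate(List<Long> perJobValues) {
        long total = 0;
        for (long v : perJobValues) total += v;
        return total;
    }

    public static void main(String[] args) {
        // From the example: job1 incremented ACount by 2, job3 by 5, job4 by 6.
        System.out.println("ACount total = "
                + aggregate(Arrays.asList(2L, 5L, 6L)));
    }
}
```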
+Vinod Kumar Vavilapalli
On Aug 28, 2012, at 1:20 AM, Kasi Subrahmanyam wrote:
> I have around 4 jobs running in a controller.
> How can I have a single unique counter shared across all the jobs, incremented wherever it is used in a job?
> For example, consider a counter ACount.
> If job1 increments the counter by 2, job3 by 5, and job4 by 6,
> can I have the counter displayed in the JobTracker as