Re: How to share Same Counter in Multiple Jobs?
I wrote the following code today. We have our own flow execution logic that calls it to collect counters.

    enum COUNT_COLLECTION {
        LOG,            // log the counters
        ADD_TO_CONF        // add counters to JobConf
    }

    protected static void collectCounters(RunningJob running, JobConf jobConf,
            EnumSet<COUNT_COLLECTION> collFlags) {
        try {
            Counters counters = running.getCounters();
            Collection<String> counterGroupNames = counters.getGroupNames();
            if (counterGroupNames == null) {
                LOG.warn("No counters returned from job " + running.getJobName());
            } else {
                // only the built-in framework and filesystem counter groups are collected
                String[] groupsToCollect = { "Map-Reduce Framework", "FileSystemCounters" };
                for (String counterGroupName : groupsToCollect) {
                    for (Iterator<Counter> iterator = counters.getGroup(counterGroupName).iterator();
                            iterator.hasNext();) {
                        Counter counter = iterator.next();
                        String counterName = counters.getGroup(counterGroupName).getDisplayName()
                                + "." + counter.getDisplayName();
                        if (collFlags.contains(COUNT_COLLECTION.LOG)) {
                            LOG.info(counterName + ": " + counter.getCounter());
                        }
                    }
                }
            }
        } catch (IOException e) {
            LOG.error("unable to retrieve counters", e);
        }
    }
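
For context, a call site might look roughly like this (a minimal sketch; MyJob is a placeholder driver class, and the old mapred JobClient API is assumed):

    JobConf jobConf = new JobConf(MyJob.class);       // MyJob is a placeholder driver class
    RunningJob running = JobClient.runJob(jobConf);   // runJob blocks until the job completes
    collectCounters(running, jobConf, EnumSet.of(COUNT_COLLECTION.LOG));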

You can pass the counter from Job 1 to Job 3 via JobConf.
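
For example, a rough sketch of the Job 1 -> Job 3 hand-off using the old mapred API (the counter group/name "MyCounters"/"RECORDS_SEEN" and the conf key "job1.records.seen" are made-up names for illustration):

    import java.io.IOException;
    import java.util.Iterator;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reducer;
    import org.apache.hadoop.mapred.Reporter;
    import org.apache.hadoop.mapred.RunningJob;

    public class CounterHandoff {

        // After job 1 completes, copy one of its counter values into job 3's conf.
        // "MyCounters"/"RECORDS_SEEN" and "job1.records.seen" are made-up names.
        public static void copyCounterToJob3(RunningJob job1, JobConf job3Conf)
                throws IOException {
            long recordsSeen = job1.getCounters()
                    .findCounter("MyCounters", "RECORDS_SEEN").getCounter();
            job3Conf.setLong("job1.records.seen", recordsSeen);
        }

        // Job 3's reducer reads the value back from its JobConf in configure().
        public static class Job3Reducer extends MapReduceBase
                implements Reducer<Text, LongWritable, Text, LongWritable> {

            private long job1RecordsSeen;

            public void configure(JobConf conf) {
                job1RecordsSeen = conf.getLong("job1.records.seen", 0L);
            }

            public void reduce(Text key, Iterator<LongWritable> values,
                    OutputCollector<Text, LongWritable> out, Reporter reporter)
                    throws IOException {
                long sum = 0;
                while (values.hasNext()) {
                    sum += values.next().get();
                }
                // the value carried over from job 1 is now usable in job 3's logic
                out.collect(key, new LongWritable(sum + job1RecordsSeen));
            }
        }
    }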

On Thu, Dec 9, 2010 at 9:45 PM, Savannah Beckett <[EMAIL PROTECTED]> wrote:

> Hi,
>   I chain multiple jobs in my program.  Job 1's reduce function has a
> counter.  I want job 3's reduce function to read Job 1's counter.  How?
> Thanks.
>
>