Re: How to change the scheduler
If you want to change the default scheduler, take a look at the thread "Dynamic
changing of slaves" on the mailing list, where I described my understanding
of the scheduling process. However, if you want to modify the fair-scheduling
code itself, look at the classes FairScheduler.java and
SchedulingAlgorithms.java in the fairscheduler package. There you can write
your own comparator that orders jobs/pools by whatever parameters you need, or
implement your own method for calculating job/pool shares.
If I am not mistaken, there is another option: simply extend one of the
existing scheduler classes (I do not remember which) and set job weights
(check the docs from Harsh).
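
For illustration only, here is a minimal, self-contained sketch of the
comparator idea. The JobInfo class, its fields, and the weight-based ordering
are hypothetical stand-ins for the per-job/pool state the real scheduler keeps;
this is not the actual Hadoop API, just the shape of the logic.

import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;

// Hypothetical stand-in for the per-job state a scheduler tracks.
class JobInfo {
    String name;
    double weight;      // relative share assigned to the job/pool
    int runningTasks;   // tasks currently running

    JobInfo(String name, double weight, int runningTasks) {
        this.name = name;
        this.weight = weight;
        this.runningTasks = runningTasks;
    }

    // Jobs running fewer tasks than their weight suggests should sort first.
    double tasksPerWeight() {
        return runningTasks / weight;
    }
}

// Custom ordering in the spirit of the comparators in SchedulingAlgorithms.java:
// whichever job is furthest below its share gets the next task slot.
class WeightedShareComparator implements Comparator<JobInfo> {
    @Override
    public int compare(JobInfo a, JobInfo b) {
        return Double.compare(a.tasksPerWeight(), b.tasksPerWeight());
    }
}

public class SchedulerSketch {
    public static void main(String[] args) {
        List<JobInfo> jobs = new ArrayList<JobInfo>();
        jobs.add(new JobInfo("etl", 2.0, 6));
        jobs.add(new JobInfo("adhoc", 1.0, 1));
        jobs.add(new JobInfo("report", 1.0, 3));

        Collections.sort(jobs, new WeightedShareComparator());

        // "adhoc" prints first: 1 running task against weight 1.0 is the
        // largest deficit relative to its share (ratio 1.0 vs 3.0 and 3.0).
        for (JobInfo j : jobs) {
            System.out.println(j.name + " -> " + j.tasksPerWeight());
        }
    }
}

The point is only the shape: the scheduler keeps an ordering over runnable jobs
and hands the next slot to whichever sorts first, so swapping in a different
comparator (or a different share calculation) changes the policy.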
On 7 March 2012 01:08, Ranjan Banerjee <[EMAIL PROTECTED]> wrote:

> Hello Harsh,
>    Thanks for the quick info. I will go through the specifics. I hope you
> will address any further doubts that I have on the issue.
>
> Regards,
> Ranjan
>
> On 03/06/12, Harsh J   wrote:
> > Ranjan,
> >
> > Schedulers do not apply per-job. You need to change it at the JobTracker.
> >
> > Follow instructions at
> > http://hadoop.apache.org/common/docs/r1.0.0/fair_scheduler.html to
> > switch scheduler to FairScheduler.
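
If I remember the docs Harsh linked correctly, switching the JobTracker to the
fair scheduler comes down to putting the contrib fairscheduler jar on the
JobTracker's classpath and setting one property in mapred-site.xml, roughly as
below (property name per the 1.0 fair scheduler page, so double-check it there):

<property>
  <name>mapred.jobtracker.taskScheduler</name>
  <value>org.apache.hadoop.mapred.FairScheduler</value>
</property>

The same property would point at your own scheduler class if you write one.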
> >
> > On Wed, Mar 7, 2012 at 4:08 AM, Ranjan Banerjee <[EMAIL PROTECTED]> wrote:
> > >
> > > Hello,
> > >    I am relatively new to Hadoop; I started playing with it around two
> > > weeks ago and have finished running the canonical word-count MapReduce
> > > example. My class project involves coming up with a different scheduler
> > > for Hadoop. I know that Hadoop uses FIFO scheduling by default and that
> > > it also ships with the fair scheduler. Can someone suggest where exactly
> > > I should write my scheduler code, and what change I need to make to the
> > > Conf object so that Hadoop uses my scheduler instead of the default one
> > > when scheduling MapReduce jobs?
> > >
> > > Regards,
> > > Ranjan
> > >
> > >
> > >
> > >
> > > On 03/02/12, mohammed elsaeedy   wrote:
> > >> Dear Mailing list,
> > >>
> > >>    I've been trying to simply build the Hadoop source code on my MacBook
> > >> Pro. I tried following the tutorial mentioned here:
> > >> http://wiki.apache.org/hadoop/EclipseEnvironment
> > >> I grabbed the source code from GitHub, and then I tried to run Maven:
> > >>
> > >> mvn install -DskipTests
> > >>
> > >> mvn clean package -DskipTests -Pdist -Dtar -Dmaven.javadoc.skip=true
> > >>
> > >>
> > >> but I always get the following error:
> > >>
> > >> [INFO] ------------------------------------------------------------------------
> > >> [INFO] Total time: 14:48.283s
> > >> [INFO] Finished at: Fri Mar 02 13:29:20 EET 2012
> > >> [INFO] Final Memory: 68M/123M
> > >> [INFO] ------------------------------------------------------------------------
> > >> [ERROR] Failed to execute goal
> > >> org.apache.maven.plugins:maven-pdf-plugin:1.1:pdf (pdf) on project
> > >> hadoop-distcp: Error during document generation: Error parsing
> > >> /Users/SaSa/Desktop/Hadoop/src/hadoop-common/hadoop-tools/hadoop-distcp/target/pdf/site.tmp/xdoc/index.xml:
> > >> Error validating the model: Fatal error:
> > >> [ERROR] Public ID: null
> > >> [ERROR] System ID: http://maven.apache.org/xsd/xdoc-2.0.xsd
> > >> [ERROR] Line number: 2699
> > >> [ERROR] Column number: 5
> > >> [ERROR] Message: The element type "xs:element" must be terminated by the
> > >> matching end-tag "</xs:element>".
> > >> [ERROR] -> [Help 1]
> > >> [ERROR]
> > >> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> > >> -e switch.
> > >> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> > >> [ERROR]
> > >> [ERROR] For more information about the errors and possible solutions,
> > >> please read the following articles:
> > >> [ERROR] [Help 1]
> > >> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> > >> [ERROR]
> > >> [ERROR] After correcting the problems, you can resume the build with the
> > >> command
> > >> [ERROR]   mvn <goals> -rf :hadoop-distcp