Ordinal Regression in Evolutionary Computation

  • Thomas Philip Runarsson
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4193)


Surrogate ranking in evolutionary computation using ordinal regression is introduced. The fitness of individual points is estimated indirectly by modeling their rank, with the aim of reducing the number of costly fitness evaluations needed for evolution. The ordinal regression, or preference learning, approach uses a kernel-defined feature space and an optimization technique that maximizes the margin between rank boundaries. The technique is illustrated on some classical numerical optimization functions using an evolution strategy. Finally, the benefits of surrogate ranking, compared with surrogates that model the fitness function directly, are discussed.
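The surrogate-ranking idea can be sketched in a few lines. The toy below is an illustrative assumption-laden stand-in, not the paper's algorithm: kernel ridge regression on negated ranks substitutes for the margin-maximizing ordinal regression described here, and a degree-2 polynomial kernel plays the role of the kernel-defined feature space. The key point it shares with the paper is that only the surrogate's induced *ordering* of points is used, never its raw values.

```python
import numpy as np

def poly_kernel(X, Y, degree=2, c=1.0):
    """Polynomial kernel k(x, y) = (x . y + c)^degree."""
    return (X @ Y.T + c) ** degree

def fit_rank_surrogate(X, ranks, degree=2, lam=1e-6):
    """Fit a kernel surrogate whose ordering of points mimics a given
    fitness ranking (rank 0 = best).  NOTE: kernel ridge regression on
    negated ranks is used here as a simplified stand-in for the
    margin-based ordinal regression of the paper."""
    K = poly_kernel(X, X, degree)
    y = -np.asarray(ranks, dtype=float)       # higher score = better rank
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return alpha

# Toy problem: rank points for minimizing f(x) = x^2, without ever
# handing the surrogate the raw fitness values, only their ranks.
X = np.array([[-1.0], [-0.6], [0.2], [0.5], [0.9]])
true_fitness = X[:, 0] ** 2
ranks = np.argsort(np.argsort(true_fitness))  # rank 0 = smallest fitness

alpha = fit_rank_surrogate(X, ranks)
scores = poly_kernel(X, X) @ alpha            # surrogate scores, higher = better
pred_order = np.argsort(-scores)              # best-first ordering of points
```

In a surrogate-assisted evolution strategy, a model of this kind would pre-rank new candidate points so that the costly true fitness is evaluated only where the ranking is uncertain or decisive for selection.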


Keywords: Evolutionary Computation · Polynomial Kernel · Ordinal Regression · Search Point · Training Point





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Thomas Philip Runarsson
    Science Institute, University of Iceland
