
A Meta-learning Prediction Model of Algorithm Performance for Continuous Optimization Problems

  • Mario A. Muñoz
  • Michael Kirley
  • Saman K. Halgamuge
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7491)

Abstract

Algorithm selection and configuration is a challenging problem in the continuous optimization domain. One approach to tackling this problem is to develop a model that links landscape analysis measures and algorithm parameters to performance. This model can then be used to predict algorithm performance when a new optimization problem is presented. In this paper, we investigate the use of a machine learning framework to build such a model. We demonstrate the effectiveness of our technique using CMA-ES as a representative algorithm and a feed-forward backpropagation neural network as the learning strategy. Our experimental results show that we can build sufficiently accurate predictions of an algorithm’s expected performance. This information is used to rank the algorithm parameter settings based on the current problem instance, hence increasing the probability of selecting the best configuration for a new problem.
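
To make the framework concrete, the sketch below shows one way such a prediction model could be assembled. The choice of landscape measures (fitness-distance correlation and dispersion), the CMA-ES parameter (population size), the synthetic performance values, and scikit-learn's MLPRegressor standing in for the feed-forward backpropagation network are all illustrative assumptions, not the paper's actual experimental setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical training set: each row combines landscape measures of a problem
# instance with one algorithm parameter; the target is an observed performance
# value (here, lower is better).
n = 500
fdc = rng.uniform(-1.0, 1.0, n)        # fitness-distance correlation
dispersion = rng.uniform(0.0, 1.0, n)  # dispersion metric
pop_size = rng.integers(4, 65, n)      # CMA-ES population size (lambda)
X = np.column_stack([fdc, dispersion, pop_size])

# Synthetic performance values: harder-looking landscapes and ill-suited
# population sizes yield worse scores; purely for illustration.
y = (1.0 - fdc) + dispersion + 0.01 * (pop_size - 20) ** 2 + rng.normal(0.0, 0.1, n)

# A feed-forward network trained with backpropagation, playing the role of the
# learning strategy described in the abstract.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), activation="tanh",
                 max_iter=5000, random_state=0),
)
model.fit(X, y)

# For a new problem instance, compute its landscape measures once and rank
# candidate parameter settings by predicted performance.
new_fdc, new_dispersion = 0.3, 0.6
candidates = np.arange(4, 65, 4)
queries = np.column_stack([
    np.full(candidates.shape, new_fdc),
    np.full(candidates.shape, new_dispersion),
    candidates,
])
predicted = model.predict(queries)
ranking = candidates[np.argsort(predicted)]
print("Parameter settings ranked from best to worst:", ranking)
```

Ranking the candidate settings by predicted performance, rather than predicting a single best setting, mirrors the abstract's goal of increasing the probability that a well-suited configuration is chosen for an unseen problem.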

Keywords

Automatic analysis of algorithms, algorithm configuration, heuristic methods, randomized algorithms, meta-learning models

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Mario A. Muñoz (1)
  • Michael Kirley (2)
  • Saman K. Halgamuge (1)
  1. Department of Mechanical Engineering, The University of Melbourne, Parkville, Australia
  2. Department of Computing and Information Systems, The University of Melbourne, Parkville, Australia