The No Free Lunch Theorem: What Are its Main Implications for the Optimization Practice?

  • Chapter in: Black Box Optimization, Machine Learning, and No-Free Lunch Theorems
  • Part of the book series: Springer Optimization and Its Applications (SOIA, volume 170)

Abstract

This chapter considers the recent but already classic theoretical result known as the No Free Lunch Theorem in the context of optimization practice. The No Free Lunch Theorem is arguably the fundamental theoretical result of the machine learning field, yet its practical meaning and its implications for practitioners facing "real life" industrial and design optimization problems are rarely addressed in the technical literature. This discussion is intended for a broad audience of mathematicians, engineers, and computer scientists, and presents a probabilistic understanding of the theorem that can shed light on its meaning and impact in industrial optimization practice.
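For readers meeting the result for the first time, the central identity of the theorem, in the notation of Wolpert and Macready [1], reads

\[
\sum_{f} P\!\left(d_m^y \mid f, m, a_1\right) \;=\; \sum_{f} P\!\left(d_m^y \mid f, m, a_2\right),
\]

where \(d_m^y\) is the sequence of objective values observed after \(m\) distinct evaluations, the sum runs over all objective functions \(f\) on a finite search space, and \(a_1\), \(a_2\) are any two black-box algorithms: averaged uniformly over all possible problems, no algorithm outperforms any other.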

Notes

  1. With some abuse of language, in this chapter we will use the terms learning and inferring interchangeably.

  2. In metaheuristic optimization the objective function is often referred to as the fitness function, or simply the fitness.

  3. Even though there is no explicit mention of constraints here, the formulation is nonetheless sufficiently general, since constraints can be incorporated through an appropriate definition of the search space \(\mathcal {H}\); see the sketch after these notes.

  4. Of course, the sample must be properly generated; for Design of Experiments techniques see [10]. A minimal sampling sketch follows these notes.
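As a sketch of the reformulation mentioned in note 3 (the constraint functions \(g_i\) are illustrative notation, not taken from the chapter): a constrained problem can be folded into the search-space definition,

\[
\min_{x}\, f(x)\ \ \text{s.t.}\ \ g_i(x)\le 0,\ i=1,\dots,k
\qquad\Longleftrightarrow\qquad
\min_{x\in\mathcal{H}} f(x),\quad
\mathcal{H}=\{\,x : g_i(x)\le 0\ \text{for all}\ i\,\},
\]

so an "unconstrained" formulation over \(\mathcal{H}\) loses no generality.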
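As a companion to note 4, here is a minimal Latin hypercube sampler in the spirit of the Design of Experiments techniques surveyed in [10]. The function name and this NumPy implementation are illustrative assumptions, not code from the chapter:

    import numpy as np

    def latin_hypercube(n_samples, n_dims, seed=None):
        """Draw n_samples points in [0, 1)^n_dims with exactly one point
        per axis-aligned stratum in every dimension (a Latin hypercube)."""
        rng = np.random.default_rng(seed)
        # One uniform draw inside each of the n_samples strata, per dimension.
        u = rng.random((n_samples, n_dims))
        points = (np.arange(n_samples)[:, None] + u) / n_samples
        # Shuffle each dimension independently so strata are paired at random.
        for d in range(n_dims):
            points[:, d] = rng.permutation(points[:, d])
        return points

    # Example: 20 well-spread initial design points in a 3-dimensional unit cube.
    X = latin_hypercube(20, 3, seed=42)

Unlike plain uniform sampling, this guarantees that every coordinate is covered evenly even for small sample sizes, which is why such designs are preferred for initializing expensive black-box optimization runs.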

References

  1. Wolpert, D.H., Macready, W.G.: No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1(1), 67–82 (1997)

  2. Schäffler, S.: Global Optimization: A Stochastic Approach. Springer, New York (2012)

  3. Holland, J.H.: Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence. MIT Press, Cambridge (1992)

  4. Mitchell, M.: An Introduction to Genetic Algorithms. MIT Press, Cambridge (1998)

  5. Haupt, R.L., Haupt, S.E.: Practical Genetic Algorithms. Wiley, Hoboken (2004)

  6. Simon, D.: Evolutionary Optimization Algorithms. Wiley, Hoboken (2013)

  7. Luke, S.: Essentials of Metaheuristics. https://cs.gmu.edu/~sean/book/metaheuristics/ (2009)

  8. Yang, X.S.: Review of metaheuristics and generalized evolutionary walk algorithm. Int. J. Bio-Inspired Comput. 3(2), 77–84 (2011)

  9. Serafino, L.: Optimizing without derivatives: what does the no free lunch theorem actually say? Not. AMS 61, 750–755 (2014)

  10. Fang, K.T., Li, R., Sudjianto, A.: Design and Modeling for Computer Experiments. Computer Science & Data Analysis. Chapman & Hall/CRC, London (2005)

  11. Weise, T.: Global Optimization Algorithms: Theory and Application (2011). http://www.it-weise.de/projects/bookNew.pdf

  12. Eiben, A.E., Hinterding, R., Michalewicz, Z.: Parameter control in evolutionary algorithms. IEEE Trans. Evol. Comput. 3(2), 124–141 (1999)

  13. Hansen, N.: The CMA evolution strategy: a comparing review. In: Towards a New Evolutionary Computation: Advances in Estimation of Distribution Algorithms, pp. 75–102. Springer, Berlin (2006)

  14. Jin, Y.: A comprehensive survey of fitness approximation in evolutionary computation. Soft Comput. 9(1), 3–12 (2005)

  15. Conn, A.R., Scheinberg, K., Vicente, L.N.: Introduction to Derivative-Free Optimization. Society for Industrial and Applied Mathematics, Philadelphia (2009)

  16. Brochu, E., Cora, V.M., de Freitas, N.: A tutorial on Bayesian optimization of expensive cost functions, with application to active user modeling and hierarchical reinforcement learning. arXiv:1012.2599 (2010)

  17. Alpcan, T.: A framework for optimization under limited information. J. Global Optim. 55, 681–706 (2013)

  18. Hoffman, M., Brochu, E., de Freitas, N.: Portfolio allocation for Bayesian optimization. In: Proceedings of the Conference on Uncertainty in Artificial Intelligence (UAI). arXiv:1009.5419 (2011)

  19. Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, corrected edn. Springer, Berlin (2003)

  20. Sørensen, K.: Metaheuristics–the metaphor exposed. Int. Trans. Oper. Res. 22, 3–18 (2015)

  21. Stefani, M.: Protein folding and misfolding on surfaces. Int. J. Mol. Sci. 9, 2515–2542 (2008)

  22. Weise, T., Zapf, M., Chiong, R., Nebro Urbaneja, A.J.: Why is optimization difficult? In: Chiong, R. (ed.) Nature-Inspired Algorithms for Optimisation, pp. 1–50. Springer, Berlin (2009)

  23. Shan, S., Wang, G.G.: Survey of modeling and optimization strategies to solve high-dimensional design problems with computationally-expensive black-box functions. Struct. Multidiscip. Optim. 41(2), 219–241 (2010)

  24. Tenne, Y.: A computational intelligence algorithm for expensive engineering optimization problems. Eng. Appl. Artif. Intell. 25(5), 1009–1021 (2012)

  25. Serafino, L.: No free lunch theorem and Bayesian probability theory: two sides of the same coin. Some implications for black-box optimization and metaheuristics. arXiv:1311.6041 (2013)

  26. Alander, J.T.: An indexed bibliography of genetic algorithms in materials science and engineering (2008). http://lipas.uwasa.fi/TAU/report94-1/gaMSEbib.pdf

  27. El-Mihoub, T.A., Nolle, L., Battersby, A., Hopgood, A.A.: Hybrid genetic algorithms: a review. Eng. Lett. 13(2), 124–137 (2006)

  28. Garcia-Martinez, C., Rodriguez, F.J., Lozano, M.: Arbitrary function optimisation with metaheuristics. Soft Comput. 16(12), 2115–2133 (2012)

Author information

Corresponding author: Loris Serafino

Copyright information

© 2021 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Serafino, L. (2021). The No Free Lunch Theorem: What Are its Main Implications for the Optimization Practice? In: Pardalos, P.M., Rasskazova, V., Vrahatis, M.N. (eds) Black Box Optimization, Machine Learning, and No-Free Lunch Theorems. Springer Optimization and Its Applications, vol 170. Springer, Cham. https://doi.org/10.1007/978-3-030-66515-9_12
