Abstract
The chapter considers the recent but already classic theoretical result known as the No Free Lunch Theorem in the context of optimization practice. The No Free Lunch Theorem is arguably the fundamental theoretical result of the machine learning field, but its practical meaning and its implications for practitioners facing "real life" industrial and design optimization problems are rarely addressed in the technical literature. This discussion is intended for a broad audience of mathematicians, engineers, and computer scientists, and presents a probabilistic understanding of the theorem that can shed light on its meaning and impact in industrial optimization practice.
Notes
- 1.
With some abuse of language, in this chapter we will use the terms learning and inferring interchangeably.
- 2.
In metaheuristic optimization, the objective function is often referred to as the fitness function, or simply the fitness.
- 3.
Even though there is no explicit mention of constraints here, the formulation is nonetheless general enough, since constraints can be incorporated through an appropriate definition of the search space \(\mathcal {H}\).
- 4.
Of course, the sample must be properly generated. For Design of Experiments techniques, see [10].
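The points in Notes 2–4 can be illustrated together with a minimal sketch: a random-search minimizer in which a constraint is encoded purely by the definition of the search space \(\mathcal {H}\) (here, via rejection sampling). The fitness function, the disk constraint, and the sample budget below are all hypothetical choices for illustration, not anything prescribed by the chapter.

```python
import random

# Hypothetical objective -- the "fitness" of Note 2.
def fitness(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

# The constraint is folded into the search space H (Note 3):
# only points inside the unit disk belong to H.
def in_search_space(x):
    return x[0] ** 2 + x[1] ** 2 <= 1.0

def random_search(n_samples=10_000, seed=0):
    """Minimize `fitness` over H by uniform random sampling (cf. Note 4)."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    remaining = n_samples
    while remaining > 0:
        x = (rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0))
        if not in_search_space(x):
            continue  # rejection step: the constraint never touches `fitness`
        remaining -= 1
        f = fitness(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

x_star, f_star = random_search()
```

Since the unconstrained minimum at (1, −2) lies outside the unit disk, the search settles near the disk boundary; a proper Design of Experiments sampling plan (Note 4) would cover \(\mathcal {H}\) far more efficiently than uniform sampling.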
References
Wolpert, D.H., Macready, W.G.: No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1(1), 67–82 (1997)
Schöffler, S.: Global Optimization: A Stochastic Approach. Springer, New York (2012)
Holland, J.H.: Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence. The MIT Press, Cambridge (1992)
Mitchell, M.: An Introduction to Genetic Algorithms. MIT Press, Cambridge (1998)
Haupt, S.E., Haupt, R.L.: Practical Genetic Algorithms, vol. 100. Wiley, Hoboken (2004)
Simon, D.: Evolutionary Optimization Algorithms. Wiley, Hoboken (2013)
Luke, S.: Essentials of metaheuristics. https://cs.gmu.edu/~sean/book/metaheuristics/ (2009)
Yang, X.S.: Review of metaheuristics and generalized evolutionary walk algorithm. J. Bio-Inspired Comput. 3(2), 77–84 (2011)
Serafino, L.: Optimizing without derivatives: what does the no free lunch theorem actually say? Not. AMS 61, 750–755 (2014)
Sudjianto, A., Fang, K.T., Li, R.: Design and Modeling for Computer Experiments. Computer Science & Data Analysis. Chapman & Hall/CRC, London (2005)
Weise, T.: Global Optimization Algorithms. Theory and Application (2011). http://www.it-weise.de/projects/bookNew.pdf
Eiben, A.E., Hinterding, R., Michalewicz, Z.: Parameter control in evolutionary algorithms. IEEE Trans. Evol. Comput. 3, 124–141 (2000)
Hansen, N.: The CMA evolution strategy: a comparing review. In: Towards a New Evolutionary Computation. Advances on Estimation of Distribution Algorithms, pp. 1769–1776. Springer, Berlin (2006)
Jin, Y.: A comprehensive survey of fitness approximation in evolutionary computation. Soft Comput. 9(1), 3–12 (2005)
Conn, A.R., Scheinberg, K., Vicente, L.N.: Introduction to Derivative-Free Optimization. Society for Industrial and Applied Mathematics, Philadelphia (2009)
Brochu, E., Cora, V.M., de Freitas, N.: A tutorial on Bayesian optimization of expensive cost functions, with application to active user modeling and hierarchical reinforcement learning. CoRR abs/1012.2599 (2010)
Alpcan, T.: A framework for optimization under limited information. J. Global Optim. 55, 681–706 (2013)
Hoffman, M., Brochu, E., de Freitas, N.: Portfolio allocation for Bayesian optimization. UAI, arXiv:1009.5419v2 (2011)
Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction, corrected edn. Springer, Berlin (2003)
Sørensen, K.: Metaheuristics–the metaphor exposed. Intl. Trans. Oper. Res. 22, 3–18 (2015)
Stefani, M.: Protein folding and misfolding on surfaces. Int. J. Mol. Sci. 9, 2515–2542 (2008)
Weise, T., Zapf, M., Chiong, R., Nebro Urbaneja, A.J.: Why is optimization difficult? In: Chiong, R. (ed.), Nature-Inspired Algorithms for Optimisation, pp. 1–50. Springer, Berlin (2009)
Shan, S., Gary Wang, G.: Survey of modeling and optimization strategies to solve high-dimensional design problems with computationally-expensive black-box functions. Struct. Multidiscip. Optim. 41(2), 219–241 (2010)
Tenne, Y.: A computational intelligence algorithm for expensive engineering optimization problems. Eng. Appl. Artif. Intell. 25(5), 1009–1021 (2012)
Serafino, L.: No free lunch theorem and Bayesian probability theory: two sides of the same coin. Some implications for black-box optimization and metaheuristics (2013). arXiv:cs.LG/1311.6041
Alander, J.T.: An indexed bibliography of genetic algorithms in materials science and engineering (2008). http://lipas.uwasa.fi/TAU/report94-1/gaMSEbib.pdf
El-Mihoub, T.A., Nolle, L., Battersby, A., Hopgood, A.A.: Hybrid genetic algorithms: a review. Eng. Lett. 13(2), 103591 (2006)
Garcia-Martinez, C., Rodriguez, F.J., Lozano, M.: Arbitrary function optimisation with metaheuristics. Soft Comput. 16(12), 2115–2133 (2012)
Copyright information
© 2021 Springer Nature Switzerland AG
About this chapter
Cite this chapter
Serafino, L. (2021). The No Free Lunch Theorem: What Are its Main Implications for the Optimization Practice? In: Pardalos, P.M., Rasskazova, V., Vrahatis, M.N. (eds) Black Box Optimization, Machine Learning, and No-Free Lunch Theorems. Springer Optimization and Its Applications, vol 170. Springer, Cham. https://doi.org/10.1007/978-3-030-66515-9_12
Print ISBN: 978-3-030-66514-2
Online ISBN: 978-3-030-66515-9
eBook Packages: Mathematics and Statistics