Abstract
Optimization algorithms appear in the core calculations of numerous Artificial Intelligence (AI) and Machine Learning methods, as well as Engineering and Business applications. Following recent works on the theoretical deficiencies of AI, a rigorous framework for the optimization problem of a black-box objective function is developed. The algorithm stems directly from probability theory rather than from presumed inspiration, so the convergence properties of the proposed methodology are inherently stable. In particular, the proposed optimizer utilizes an algorithmic implementation of n-dimensional inverse transform sampling as a search strategy. No control parameters require tuning, and the trade-off between exploration and exploitation is satisfied by construction. A theoretical proof is provided, concluding that any optimization algorithm falling into the proposed framework, either directly or incidentally, converges. Numerical experiments verify the theoretical results regarding the efficacy of the algorithm in reaching the sought optimum.
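The search strategy named in the abstract, inverse transform sampling, can be sketched in one dimension: draw a uniform random number and map it through the inverse of a cumulative distribution function built from previously evaluated points. The following is a minimal illustrative sketch, not the paper's Algorithm 1; the weighting scheme `w = 1/(1 + f)`, which biases sampling toward lower objective values, is an assumption chosen only for this example.

```python
import numpy as np

def inverse_transform_sample(xs, weights, size=1, rng=None):
    """Draw samples from the discrete distribution defined by points `xs`
    with (unnormalized) `weights`, via inverse transform sampling:
    evaluate the inverse of the empirical CDF at uniform draws."""
    rng = np.random.default_rng() if rng is None else rng
    p = np.asarray(weights, dtype=float)
    p = p / p.sum()                # normalize weights to a pmf
    cdf = np.cumsum(p)             # empirical CDF over the points
    u = rng.uniform(size=size)     # uniform draws in [0, 1)
    idx = np.searchsorted(cdf, u)  # invert the CDF: first index with cdf >= u
    return np.asarray(xs)[idx]

# Bias the search toward points with better (lower) objective values.
xs = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
f = xs**2                          # objective values at the evaluated points
w = 1.0 / (1.0 + f)                # assumed weighting: lower f => larger weight
samples = inverse_transform_sample(xs, w, size=1000,
                                   rng=np.random.default_rng(0))
```

With these weights, the minimizer x = 0 receives the largest probability mass, so repeated draws concentrate near the current optimum while still occasionally exploring the tails.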
Code availability
Available as per Appendix 1.
Abbreviations
- \({\mathbf {x}}^*\): The argument of f corresponding to the current optimum during the optimization process
- \({\mathbf {x}}_m\): The sought \({{\text {arg\,min}} }\,f\)
- A: Search space \(\in {{{\mathbb {R}}}^{n}}\)
- h: Convergence history of optimization algorithms
- \(i=\left\{ 1,2,\ldots ,{{f}_{e}} \right\}\): Iterator over the \(f_e\) function evaluations
- \(j=\left\{ 1,2,\ldots ,n \right\}\): Iterator over the dimensions of A
- n: Number of dimensions of the set A
- \({{f}_{e}}\): The maximum number of function evaluations
References
Au CK, Leung HF (2012) Eigenspace sampling in the mirrored variant of (1, \(\lambda\))-CMA-ES. In: 2012 IEEE Congress on Evolutionary Computation, IEEE, pp 1–8
Audet C, Hare W (2017) Derivative-free and blackbox optimization. Springer, New York
Audet C, Kokkolaras M (2016) Blackbox and derivative-free optimization: theory, algorithms and applications. https://doi.org/10.1007/s11081-016-9307-4
Bezanson J, Edelman A, Karpinski S, Shah VB (2017) Julia: a fresh approach to numerical computing. SIAM Rev 59(1):65–98
Bottou L, Curtis FE, Nocedal J (2018) Optimization methods for large-scale machine learning. SIAM Rev 60(2):223–311
Bull AD (2011) Convergence rates of efficient global optimization algorithms. J Mach Learn Res 12:2879–2904
Chang BC, Ratnaweera A, Halgamuge SK, Watson HC (2004) Particle swarm optimisation for protein motif discovery. Genet Program Evolvable Mach 5(2):203–214
Clayton AD, Manson JA, Taylor CJ, Chamberlain TW, Taylor BA, Clemens G, Bourne RA (2019) Algorithms for the self-optimisation of chemical reactions. React Chem Eng 4(9):1545–1554
Clerc M, Kennedy J (2002) The particle swarm-explosion, stability, and convergence in a multidimensional complex space. IEEE Trans Evol Comput 6(1):58–73
Contributors (2020) Python 3.8.2. https://www.python.org/
Contributors (2020) GNU Octave. http://hg.savannah.gnu.org/hgweb/octave/file/tip/doc/interpreter/contributors.in
Cui H, Guo P, Li M, Guo S, Zhang F (2019) A multi-risk assessment framework for agricultural land use optimization. Stoch Environ Res Risk Assess 33(2):563–579. https://doi.org/10.1007/s00477-018-1610-5
De S, Dey S, Bhattacharyya S (eds) (2020) Recent advances in hybrid metaheuristics for data clustering. Wiley. ISBN 978-1-119-55159-1. https://www.wiley.com/en-us/Recent+Advances+in+Hybrid+Metaheuristics+for+Data+Clustering-p-9781119551591
Degasperi A, Fey D, Kholodenko BN (2017) Performance of objective functions and optimisation procedures for parameter estimation in system biology models. NPJ Syst Biol Appl 3(1):1–9
Doerr B (2020a) Probabilistic tools for the analysis of randomized optimization heuristics. In: Doerr B, Neumann F (eds) Theory of evolutionary computation. Springer, New York, pp 1–87. https://doi.org/10.1007/978-3-030-29414-4_1
Doerr C (2020b) Complexity theory for discrete black-box optimization heuristics. In: Doerr B, Neumann F (eds) Theory of evolutionary computation. Springer, New York, pp 133–212. https://doi.org/10.1007/978-3-030-29414-4_3
Feldt R (2013-2018) Blackboxoptim.jl. https://github.com/robertfeldt/BlackBoxOptim.jl
Finck S, Hansen N, Ros R, Auger A (2010) Real-parameter black-box optimization benchmarking 2009: Presentation of the noiseless functions. Tech. rep, Penn State College of Information Sciences and Technology (http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.232.650&rep=rep1&type=pdf)
Gilli M, Schumann E (2012) Heuristic optimisation in financial modelling. Ann Oper Res 193(1):129–158
Hansen N, Auger A, Ros R, Mersmann O, Tušar T, Brockhoff D (2021) Coco: a platform for comparing continuous optimizers in a black-box setting. Optim Methods Softw 36(1):114–144
Hibbert DB (1993) Genetic algorithms in chemistry. Chemom Intell Lab Syst 19(3):277–293
Holden N, Freitas AA (2005) A hybrid particle swarm/ant colony algorithm for the classification of hierarchical biological data. In: Proceedings 2005 IEEE Swarm Intelligence Symposium, 2005. SIS 2005., IEEE, pp 100–107
Hussain K, Salleh MNM, Cheng S, Naseem R (2017) Common benchmark functions for metaheuristic evaluation: A review. JOIV Int J Inform Vis 1(4–2):218–223
Hutson M (2018) AI researchers allege that machine learning is alchemy. Science 360:961. https://doi.org/10.1126/science.aau0577
Jamil M, Yang XS (2013) A literature survey of benchmark functions for global optimization problems. arXiv preprint arXiv:13084008
Kapoutsis AC, Chatzichristofis SA, Doitsidis L, de Sousa JB, Pinto J, Braga J, Kosmatopoulos EB (2016) Real-time adaptive multi-robot exploration with application to underwater map construction. Auton Robots 40(6):987–1015. https://doi.org/10.1007/s10514-015-9510-8
Kapoutsis AC, Chatzichristofis SA, Kosmatopoulos EB (2019) A distributed, plug-n-play algorithm for multi-robot applications with a priori non-computable objective functions. Int J Robot Res. https://doi.org/10.1177/0278364919845054
Karaboga N, Kalinli A, Karaboga D (2004) Designing digital iir filters using ant colony optimisation algorithm. Eng Appl Artif Intell 17(3):301–309
Lagaros ND, Papadrakakis M, Bakas NP (2006) Automatic minimization of the rigidity eccentricity of 3D reinforced concrete buildings. J Earthq Eng 10(4):533–564. https://doi.org/10.1080/13632460609350609
Lagaros ND, Bakas N, Papadrakakis M (2009) Optimum design approaches for improving the seismic performance of 3D RC buildings. J Earthq Eng 13(3):345–363. https://doi.org/10.1080/13632460802598594
Liang J, Qu B, Suganthan P, Hernández-Díaz AG (2013) Problem definitions and evaluation criteria for the CEC 2013 special session on real-parameter optimization. Computational Intelligence Laboratory, Zhengzhou University, Zhengzhou, China and Nanyang Technological University, Singapore, Technical Report 201212(34):281–295
Lin L, Cao L, Wang J, Zhang C (2004) The applications of genetic algorithms in stock market data mining optimisation. Manag Inf Syst. https://www.witpress.com/elibrary/wit-transactions-on-information-and-communication-technologies/33/14241
Moayyeri N, Gharehbaghi S, Plevris V (2019) Cost-based optimum design of reinforced concrete retaining walls considering different methods of bearing capacity computation. Mathematics 7(12):1232
Mogensen PK, Riseth AN (2018) Optim: a mathematical optimization package for Julia. J Open Source Softw 3(24):615
Muñoz MA, Smith-Miles KA (2017) Performance analysis of continuous black-box optimization algorithms via footprints in instance space. Evol Comput 25(4):529–554
Opara KR, Arabas J (2019) Differential evolution: a survey of theoretical analyses. Swarm Evol Comput 44:546–558
Papadrakakis M, Lagaros ND, Plevris V (2001) Optimum design of space frames under seismic loading. Int J Struct Stab Dyn 1(01):105–123
Papadrakakis M, Lagaros ND, Plevris V (2005) Design optimization of steel structures considering uncertainties. Eng Struct 27(9):1408–1418
Parker FD (1955) Integrals of inverse functions. Am Math Mon 62(6):439. https://doi.org/10.2307/2307006
Plevris V, Papadrakakis M (2011) A hybrid particle swarm-gradient algorithm for global structural optimization. Comput Aided Civ Infrastruct Eng 26(1):48–68
Ponomareva K, Roman D, Date P (2015) An algorithm for moment-matching scenario generation with application to financial portfolio optimisation. Eur J Oper Res 240(3):678–687
Rudin C (2019) Stop explaining black box machine learning models for high stakes decisions and use interpretable models instead. Nat Mach Intell 1(5):206–215. https://doi.org/10.1038/s42256-019-0048-x
Rudolph G (1994) Convergence analysis of canonical genetic algorithms. IEEE Trans Neural Netw 5(1):96–101
Sculley D, Snoek J, Rahimi A, Wiltschko A (2018) Winner’s Curse? On Pace, Progress, and Empirical Rigor. ICLR Workshop track
Siddique N, Adeli H (2017) Nature-inspired chemical reaction optimisation algorithms. Cogn Comput 9(4):411–422
Sra S, Nowozin S, Wright SJ (2012) Optimization for machine learning. MIT Press, Cambridge
Wang CF, Hu MC, Lee CH, Yu HL (2019a) Optimization of air quality monitoring network based on a spatiotemporal-spectrum manifold analysis. Stoch Environ Res Risk Assess 33(10):1835–1849. https://doi.org/10.1007/s00477-019-01730-x
Wang Y, Liu L, Guo P, Zhang C, Zhang F, Guo S (2019b) An inexact irrigation water allocation optimization model under future climate change. Stoch Environ Res Risk Assess 33(1):271–285. https://doi.org/10.1007/s00477-018-1597-y
Wu J, Poloczek M, Wilson AG, Frazier P (2017) Bayesian optimization with gradients. Adv Neural Inf Process Syst pp 5267–5278. https://papers.nips.cc/paper/2017/hash/64a08e5f1e6c39faeb90108c430eb120-Abstract.html
Acknowledgements
The Authors would like to acknowledge the insightful comments and suggestions of two anonymous Reviewers and the Editor, which significantly helped them to improve the quality of the presented work.
Funding
The contribution of Andreas Langousis has been conducted within the project PerManeNt, which has been co-financed by the European Regional Development Fund of the European Union and Greek National Funds through the Operational Program Competitiveness, Entrepreneurship and Innovation, under the call RESEARCH – CREATE – INNOVATE (project code: T2EDK-04177).
Ethics declarations
Conflicts of interest
The authors declare that they have no conflicts of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendices
Appendix 1: Programming code
The corresponding computer code is available on GitHub at https://github.com/nbakas/ITSO.jl. The examples of Figure 3 may be reproduced by running __run.jl. A short version of Algorithm 1 is available in the Julia language (Bezanson et al. 2017) in the file ITSO-short.jl, in Octave (Contributors 2020) in the file ITSOshort.m, and in Python (Contributors 2020) in the file ITSO-short.py. The framework is implemented in a few lines of computer code, which can be easily adapted to case-specific applications.
Appendix 2: Black-Box functions
The following functions were used for the numerical experiments. Equations 20 and 21 (Elliptic, Cigar) were taken from Liang et al. (2013); Cigtab (Eq. 22) and Griewank (Eq. 23) from Au and Leung (2012); Quartic (Eq. 24) from Jamil and Yang (2013); Schwefel (Eq. 25), Rastrigin (Eq. 26), Sphere (Eq. 27), and Ellipsoid (Eq. 28) from Finck et al. (2010) and Feldt (2013-2018); and Alpine (Eq. 29) from Hussain et al. (2017). Equations 30, 31, and 32 were developed by the authors. The code implementing the selected equations appears in the file functions_opti.jl in the supplementary computer code.
The exact variation used in this work is as follows. We have adopted the notation presented in the Nomenclature section, where i denotes the optimization history step and j the dimension of the design variable \(x_{ij}\).
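The exact variants used in the experiments are given by Eqs. 20–32 and by the file functions_opti.jl. For reference, two of the standard benchmark functions listed above have simple closed forms; the sketch below uses the standard textbook definitions (Sphere and Rastrigin), not necessarily the paper's exact variants.

```python
import numpy as np

def sphere(x):
    """Sphere function: f(x) = sum_j x_j^2; global minimum 0 at x = 0."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x**2))

def rastrigin(x):
    """Rastrigin function: f(x) = 10 n + sum_j (x_j^2 - 10 cos(2 pi x_j));
    highly multimodal, global minimum 0 at x = 0."""
    x = np.asarray(x, dtype=float)
    n = x.size
    return float(10.0 * n + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x)))
```

Sphere is unimodal and tests exploitation, while Rastrigin's regularly spaced local minima test an optimizer's ability to escape basins of attraction, which is why both appear in standard black-box benchmark suites.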
Cite this article
Bakas, N.P., Plevris, V., Langousis, A. et al. ITSO: a novel inverse transform sampling-based optimization algorithm for stochastic search. Stoch Environ Res Risk Assess 36, 67–76 (2022). https://doi.org/10.1007/s00477-021-02025-w