Abstract
This paper proposes a novel algorithm for non-convex multimodal constrained optimisation problems. It is based on sequentially solving restrictions of the problem to sections of the feasible set cut out by random subspaces (in general, manifolds) of low dimensionality. The approach can be varied in the way subspaces are drawn, in the dimensionality of the subspaces, and in the method used to solve the restricted problems. We provide an empirical study of the algorithm on convex, unimodal, and multimodal optimisation problems and compare it with efficient algorithms intended for each class of problems.
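The scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' method: it assumes an unconstrained problem, draws a random low-dimensional affine subspace through the current iterate at each outer step, and approximately minimises the restriction by crude random sampling (the paper's actual subspace-drawing rule, inner solver, and non-descent acceptance rule differ). All names and parameters below are hypothetical.

```python
import numpy as np

def random_subspace_search(f, x0, dim=2, outer_iters=200,
                           inner_samples=50, radius=1.0, rng=None):
    """Sketch: sequentially restrict f to random affine subspaces
    x + A @ y through the current iterate and approximately minimise
    each restriction (here by naive random sampling in the subspace)."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(outer_iters):
        A = rng.standard_normal((n, dim))   # random basis of a dim-dimensional subspace
        A /= np.linalg.norm(A, axis=0)      # normalise basis columns
        # inner "solver": sample subspace coordinates y, keep the best candidate
        ys = rng.uniform(-radius, radius, size=(inner_samples, dim))
        candidates = x + ys @ A.T           # points x + A @ y in the full space
        values = np.array([f(c) for c in candidates])
        best = values.argmin()
        if values[best] < f(x):             # simple acceptance rule for illustration
            x = candidates[best]
    return x

# usage: minimise a simple quadratic starting from a distant point
f = lambda z: float(np.sum(z ** 2))
x_star = random_subspace_search(f, np.full(10, 3.0), rng=0)
```

Because each restricted problem is only `dim`-dimensional, the inner solver can be any low-dimensional routine (grid search, Nelder-Mead, a line search for `dim = 1`), which is what makes the family of methods flexible.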
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Pasechnyuk, D.A., Gornov, A. (2024). A Randomised Non-descent Method for Global Optimisation. In: Olenev, N., Evtushenko, Y., Jaćimović, M., Khachay, M., Malkova, V. (eds) Advances in Optimization and Applications. OPTIMA 2023. Communications in Computer and Information Science, vol 1913. Springer, Cham. https://doi.org/10.1007/978-3-031-48751-4_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-48750-7
Online ISBN: 978-3-031-48751-4