Branching and bounding improvements for global optimization algorithms with Lipschitz continuity properties

Journal of Global Optimization

Abstract

We present improvements to branch and bound techniques for globally optimizing functions with Lipschitz continuity properties by developing novel bounding procedures and parallelisation strategies. The bounding procedures involve nonconvex quadratic or cubic lower bounds on the objective and use estimates of the spectrum of the Hessian or derivative tensor, respectively. As the nonconvex lower bounds are only tractable if solved over Euclidean balls, we implement them in the context of a recent branch and bound algorithm (Fowkes et al. in J Glob Optim 56:1791–1815, 2013) that uses overlapping balls. Compared to the rectangular tessellations of traditional branch and bound, overlapping ball coverings result in an increased number of subproblems that need to be solved and hence make the need for their parallelisation even more pressing and challenging. We develop parallel variants based on both data- and task-parallel paradigms, which we test on an HPC cluster on standard test problems with promising results.
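
The quadratic bounding procedure mentioned above admits a compact illustration. The sketch below (Python, not the authors' code) evaluates the minimum over a Euclidean ball of a quadratic underestimator built from the function value and gradient at the ball centre, assuming lam is a known lower bound on the Hessian spectrum over the ball; how such spectral estimates are obtained, and the cubic analogue, are developed in the paper, and the function name and arguments are illustrative only.

```python
import numpy as np

def ball_quadratic_lower_bound(f_c, g_c, lam, r):
    """Minimum over the ball ||x - x_c|| <= r of the quadratic underestimator
        q(x) = f(x_c) + g(x_c)^T (x - x_c) + (lam / 2) ||x - x_c||^2,
    where lam is any lower bound on the Hessian eigenvalues over the ball.
    The curvature term is isotropic, so the minimiser lies along the negative
    gradient and the problem reduces to a 1-D quadratic in t = ||x - x_c||."""
    gnorm = np.linalg.norm(g_c)
    phi = lambda t: f_c - t * gnorm + 0.5 * lam * t ** 2
    if lam > 0 and gnorm <= lam * r:
        t_star = gnorm / lam   # unconstrained minimiser lies inside the ball
    else:
        t_star = r             # otherwise the minimum sits on the boundary
    return phi(t_star)
```

A value of this kind could serve as the lower bound \(\underline{f}(\fancyscript{B})\) used for pruning in the branch and bound framework discussed in the notes below.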


Notes

  1. Note that we need a larger set here as the balls in our overlapping covering extend outside the domain during the initial subdivisions.

  2. Note that \(\fancyscript{B}\) does not need to be convex provided all line segments from \(x_{\fancyscript{B}}\) to \(x\) are contained in \(\fancyscript{B}\), i.e. if \(\fancyscript{B}\) is star-convex with star-centre \(x_{\fancyscript{B}}\).

  3. Note that if \(\fancyscript{B}\) is not assumed to be compact, then (1.2) still holds provided the gradient \(g\) is Lipschitz continuous on the convex subdomain \(\fancyscript{B}\) and \(f \in C^1(\fancyscript{B})\).

  4. Note that one can instead use the alternative definition of \(\ell^2\)-eigenvalues as the stationary points of the multilinear Rayleigh quotient for the \(\ell^2\)-norm, \({Tx^3}/{||x||_2^3}\) [28]; the smallest \(\ell^2\)-eigenvalue of \(T\), \(\lambda^{\ell^2}_{\min}(T)\), would then be given by \(\lambda^{\ell^2}_{\min}(T) = \min_{x \ne 0} {Tx^3}/{||x||_2^3}\). However, we will not use \(\ell^2\)-eigenvalues here for reasons that will become clear later.

  5. Note that an analogous result holds for \(\ell^2\)-eigenvalues. Unfortunately, to the best of our knowledge, no eigenvalue algorithm is guaranteed to converge to the smallest \(\ell^2\)-eigenvalue, although it is possible to use generalisations of the power method with multiple starting points [22, 41]. However, this is not reliable, as (1.3) requires a bound on the smallest eigenvalue and multiple starting points do not guarantee one. Furthermore, there is no generalisation of Gershgorin’s Theorem for \(\ell^2\)-eigenvalues, which is what we propose next for \(\ell^3\)-eigenvalues (for reference, the classical matrix form of Gershgorin’s bound is sketched after these notes).

  6. Note that this bound appears in Section 6.2 of Fowkes et al. [16] but it is incorrect there.

  7. Note that if \(\fancyscript{B}\) is not assumed to be compact, then (1.3) still holds provided the Hessian \(H\) is Lipschitz continuous on the convex subdomain \(\fancyscript{B}\) and \(f \in C^2(\fancyscript{B})\).

  8. As correctly pointed out by an anonymous referee (and as used in Algorithm 3.2), one can optionally use \(\underline{f}(\fancyscript{B}) > U_k-\varepsilon \) as the pruning condition, which may allow the algorithm to discard more redundant balls (see the priority-queue sketch after these notes).

  9. Note that \(\fancyscript{R}\) is always a finite set. As the radius is halved each time a ball is split, there can only be a finite number of radii before numerical underflow occurs.

  10. Note that since \(\fancyscript{L}^p\) is a priority queue w.r.t. \(\underline{f}(\fancyscript{B})\), \(\fancyscript{B}\) has the smallest lower bound \(\underline{f}(\fancyscript{B})\) of all balls in \(\fancyscript{L}^p\) (see the sketch after these notes).

  11. Note that another way to assess the efficiency of parallelism is to compute the redundancy, defined as \((F_P-F_1)/F_P\) when \(F_P>F_1\) and \(0\) otherwise, where \(F_1\) is the number of function evaluations of the serial algorithm and \(F_P\) that of the parallel algorithm on \(P\) processors (see p. 324 of Strongin and Sergeyev [40]); a small helper computing this quantity is sketched after these notes.

  12. Note that the majority of problems in the COCONUT benchmark have nonlinear constraints that our algorithms cannot handle at present. This rather limited the number of problems we could actually test.
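
Note 5 above appeals to a Gershgorin-type bound for \(\ell^3\)-eigenvalues; that tensor result is the paper's own contribution and is not reproduced here. For reference, the sketch below (Python, illustrative naming) implements only the classical order-2 (matrix) Gershgorin lower bound on the smallest eigenvalue, which results of this kind generalise.

```python
import numpy as np

def gershgorin_lambda_min(A):
    """Classical Gershgorin lower bound on the smallest eigenvalue of a
    symmetric matrix A: every eigenvalue lies in a disc centred at some a_ii
    with radius sum_{j != i} |a_ij|, hence
        lambda_min(A) >= min_i ( a_ii - sum_{j != i} |a_ij| )."""
    A = np.asarray(A, dtype=float)
    off_diag_radii = np.abs(A).sum(axis=1) - np.abs(np.diag(A))
    return float(np.min(np.diag(A) - off_diag_radii))
```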
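
Notes 8 and 10 above refer to the pruning test \(\underline{f}(\fancyscript{B}) > U_k-\varepsilon\) and to a priority queue of balls ordered by their lower bounds. The following minimal sketch shows how the two might interact in a serial loop; ball splitting, bound evaluation and the covering itself are omitted, and all names are hypothetical.

```python
import heapq
import itertools

_tie_breaker = itertools.count()  # avoids comparing ball objects on bound ties

def push_ball(queue, lower_bound, ball):
    """Insert a ball into the min-heap keyed on its lower bound."""
    heapq.heappush(queue, (lower_bound, next(_tie_breaker), ball))

def pop_best_ball(queue, U_k, eps):
    """Return (lower_bound, ball) for the ball with the smallest lower bound.
    If even that bound exceeds U_k - eps, every remaining ball is redundant
    (the heap is ordered by lower bound), so the whole queue is pruned."""
    if not queue:
        return None
    lb, _, ball = heapq.heappop(queue)
    if lb > U_k - eps:
        queue.clear()
        return None
    return lb, ball
```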
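
Note 11 above defines the redundancy of a parallel run from function-evaluation counts; the formula translates directly into the following helper (illustrative naming).

```python
def redundancy(F_1, F_P):
    """Redundancy (F_P - F_1) / F_P when F_P > F_1, and 0 otherwise, where F_1
    and F_P are the function-evaluation counts of the serial and P-processor
    runs (cf. p. 324 of Strongin and Sergeyev [40])."""
    return (F_P - F_1) / F_P if F_P > F_1 else 0.0
```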

References

  1. Alba, E., Almeida, F., Blesa, M., Cotta, C., Díaz, M., Dorta, I., Gabarró, J., León, C., Luque, G., Petit, J., Rodríguez, C., Rojas, A., Xhafa, F.: Efficient parallel LAN/WAN algorithms for optimization. The MALLBA project. Parallel Comput. 32(5–6), 415–440 (2006). doi:10.1016/j.parco.2006.06.007

  2. Ananth, G.Y., Kumar, V., Pardalos, P.M.: Parallel processing of discrete optimization problems. In: Kent, A., Williams, J. (eds.) Encyclopedia of Microcomputers, vol. 13, pp. 129–147, Dekker (1993). http://books.google.co.uk/books?id=Rx3hqGdXcooC

  3. Baritompa, W., Cutler, A.: Accelerations for global optimization covering methods using second derivatives. J. Glob. Optim. 4(3), 329–341 (1994). doi:10.1007/BF01098365

  4. Breiman, L., Cutler, A.: A deterministic algorithm for global optimization. Math. Program. 58(1–3), 179–199 (1993). doi:10.1007/BF01581266

  5. Cartis, C., Fowkes, J.M., Gould, N.I.M.: Branching and bounding improvements for global optimization algorithms with Lipschitz continuity properties. Tech. rep., Optimization Online. http://www.optimization-online.org/DB_HTML/2013/06/3914.html (2013)

  6. Casado, L.G., Martínez, J.A., García, I., Hendrix, E.M.T.: Branch-and-bound interval global optimization on shared memory multiprocessors. Optim. Methods Softw. 23(5), 689–701 (2008). doi:10.1080/10556780802086300

  7. Conn, A.R., Gould, N.I.M., Toint, P.L.: Trust-Region Methods. MPS-SIAM Series on Optimization, SIAM. http://books.google.co.uk/books?id=5kNC4fqssYQC (2000)

  8. Crainic, T.G., Le Cun, B., Roucairol, C.: Parallel branch-and-bound algorithms. In: Talbi, E. (ed.) Parallel Combinatorial Optimization, Wiley Series on Parallel and Distributed Computing, pp. 1–28. Wiley, New York. http://books.google.co.uk/books?id=rYtuk_sm23UC (2006)

  9. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002). doi:10.1007/s101070100263

  10. Evtushenko, Y.G.: Numerical methods for finding global extrema (case of a non-uniform mesh). USSR Comput. Math. Math. Phys. 11(6), 38–54 (1971)

  11. Evtushenko, Y.G., Posypkin, M.A.: An application of the nonuniform covering method to global optimization of mixed integer nonlinear problems. Comput. Math. Math. Phys. 51(8), 1286–1298 (2011). doi:10.1134/S0965542511080082

  12. Evtushenko, Y.G., Posypkin, M.A.: A deterministic approach to global box-constrained optimization. Optim. Lett. 7(4), 819–829 (2013). doi:10.1007/s11590-012-0452-1

  13. Evtushenko, Y.G., Posypkin, M.A., Sigal, I.: A framework for parallel large-scale global optimization. Comput. Sci. Res. Dev. 23(3–4), 211–215 (2009). doi:10.1007/s00450-009-0083-7

  14. Floudas, C.: Deterministic Global Optimization: Theory, Methods and Applications. Nonconvex Optimization and Its Applications. Springer, Berlin. http://books.google.co.uk/books?id=qZSpq27TsOcC (1999)

  15. Fowkes, J.M.: Bayesian Numerical Analysis: Global Optimization and Other Applications. PhD thesis, Mathematical Institute, University of Oxford. http://ora.ox.ac.uk/objects/uuid:ab268fe7-f757-459e-b1fe-a4a9083c1cba (2012)

  16. Fowkes, J.M., Gould, N.I.M., Farmer, C.L.: A branch and bound algorithm for the global optimization of Hessian Lipschitz continuous functions. J. Glob. Optim. 56(4), 1791–1815 (2013). doi:10.1007/s10898-012-9937-9

  17. Gaviano, M., Lera, D.: A global minimization algorithm for Lipschitz functions. Optim. Lett. 2(1), 1–13 (2008). doi:10.1007/s11590-006-0036-z

  18. Gendron, B., Crainic, T.G.: Parallel branch-and-bound algorithms: survey and synthesis. Oper. Res. 42(6), 1042–1066 (1994). doi:10.1287/opre.42.6.1042

  19. Grishagin, V.A.: Operating characteristics of some global search algorithms. Probl. Stoch. Search 7, 198–206 (1978). (In Russian)

  20. Horst, R., Tuy, H.: Global Optimization: Deterministic Approaches. Springer, Berlin. http://books.google.co.uk/books?id=usFjGFvuBDEC (1996)

  21. Knuth, D.: The Art of Computer Programming, Vol. 3: Sorting and Searching. Addison-Wesley. http://books.google.co.uk/books?id=ePzuAAAAMAAJ (1998)

  22. Kolda, T., Mayo, J.: Shifted power method for computing tensor eigenpairs. SIAM J. Matrix Anal. Appl. 32(4), 1095–1124 (2011). doi:10.1137/100801482

  23. Kreinovich, V., Kearfott, R.: Beyond convex? Global optimization is feasible only for convex objective functions: a theorem. J. Glob. Optim. 33(4), 617–624 (2005). doi:10.1007/s10898-004-2120-1

  24. Kvasov, D.E., Sergeyev, Y.D.: A univariate global search working with a set of Lipschitz constants for the first derivative. Optim. Lett. 3(2), 303–318 (2009). doi:10.1007/s11590-008-0110-9

  25. Kvasov, D.E., Sergeyev, Y.D.: Lipschitz gradients for global optimization in a one-point-based partitioning scheme. J. Comput. Appl. Math. 236(16), 4042–4054 (2012). doi:10.1016/j.cam.2012.02.020

  26. Kvasov, D.E., Sergeyev, Y.D.: Univariate geometric Lipschitz global optimization algorithms. Numer. Algebr. Control Optim. 2(1), 69–90 (2012). doi:10.3934/naco.2012.2.69

  27. Lera, D., Sergeyev, Y.D.: Acceleration of univariate global optimization algorithms working with Lipschitz functions and Lipschitz first derivatives. SIAM J. Optim. 23(1), 508–529 (2013). doi:10.1137/110859129

  28. Lim, L.H.: Singular values and eigenvalues of tensors: a variational approach. In: 1st IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, pp. 129–132 (2005). doi:10.1109/CAMAP.2005.1574201

  29. Neumaier, A.: Complete search in continuous global optimization and constraint satisfaction. Acta Numer. 13, 271–369 (2004). doi:10.1017/S0962492904000194

  30. Pardalos, P.M., Horst, R., Thoai, N.V.: Introduction to Global Optimization, Nonconvex Optimization and its Applications, vol. 3. Springer, Berlin. http://www.springer.com/mathematics/book/978-0-7923-3556-6 (1995)

  31. Paulavičius, R., Žilinskas, J., Grothey, A.: Parallel branch and bound for global optimization with combination of Lipschitz bounds. Optim. Methods Softw. 26(3), 487–498 (2011). doi:10.1080/10556788.2010.551537

  32. Pintér, J.D.: Global Optimization in Action, Nonconvex Optimization and its Applications, vol. 6. Springer, Berlin. http://www.springer.com/mathematics/book/978-0-7923-3757-7 (1996)

  33. Piyavskii, S.A.: An algorithm for finding the absolute extremum of a function. USSR Comput. Math. Math. Phys. 12(4), 57–67 (1972). doi:10.1016/0041-5553(72)90115-2

  34. Qi, L.: Eigenvalues of a real supersymmetric tensor. J. Symb. Comput. 40(6), 1302–1324 (2005). doi:10.1016/j.jsc.2005.05.007. http://www.sciencedirect.com/science/article/pii/S0747717105000817

  35. Sergeyev, Y.D.: Global one-dimensional optimization using smooth auxiliary functions. Math. Program. 81(1), 127–146 (1998). doi:10.1007/BF01584848

  36. Sergeyev, Y.D., Strongin, R.G., Lera, D.: Introduction to Global Optimization Exploiting Space-Filling Curves. Springer Briefs in Optimization, Springer, Berlin. http://books.google.co.uk/books?id=IqyYnAEACAAJ (2013)

  37. Shcherbina, O., Neumaier, A., Sam-Haroud, D., Vu, X.H., Nguyen, T.V.: Benchmarking global optimization and constraint satisfaction codes. In: Bliek, C., Jermann, C., Neumaier, A. (eds.) Global Optimization and Constraint Satisfaction, Lecture Notes in Computer Science, vol. 2861, pp. 211–222. Springer, Berlin, Heidelberg (2003). doi:10.1007/978-3-540-39901-8_16

  38. Shubert, B.: A sequential method seeking the global maximum of a function. SIAM J. Numer. Anal. 9(3), 379–388 (1972). doi:10.1137/0709036

  39. Stephens, C.P., Baritompa, W.: Global optimization requires global information. J. Optim. Theory Appl. 96(3), 575–588 (1998). doi:10.1023/A:1022612511618

  40. Strongin, R.G., Sergeyev, Y.D.: Global Optimization with Non-convex Constraints: Sequential and Parallel Algorithms. Nonconvex Optimization and its Applications. Springer, Berlin. http://books.google.co.uk/books?id=xh_GF9Dor3AC (2000)

  41. Zhang, X., Qi, L., Ye, Y.: The cubic spherical optimization problems. Math. Comput. 81(279), 1513–1525 (2012). doi:10.1090/S0025-5718-2012-02577-4

Acknowledgments

The work of the first and second authors was supported by EPSRC grants EP/I028854/1 and NAIS EP/G036136/1 and the work of the third author by EP/I013067/1. We are also grateful to NAIS for funding computing time.

Author information

Corresponding author

Correspondence to Jaroslav M. Fowkes.

Cite this article

Cartis, C., Fowkes, J.M. & Gould, N.I.M. Branching and bounding improvements for global optimization algorithms with Lipschitz continuity properties. J Glob Optim 61, 429–457 (2015). https://doi.org/10.1007/s10898-014-0199-6
