A new hybrid classical-quantum algorithm for continuous global optimization problems

Journal of Global Optimization

Abstract

Grover’s algorithm can be employed in global optimization methods, providing in some cases a quadratic speedup over classical algorithms. This paper describes a new method for continuous global optimization problems that uses a classical algorithm to find a local minimum and Grover’s algorithm to escape from that local minimum. Such algorithms will be useful when quantum computers of reasonable size become available. Simulations with testbed functions and comparisons with algorithms from the literature are presented.
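The hybrid scheme described in the abstract can be sketched classically. Everything in the sketch below is our own illustration, not the paper's algorithm: `local_descent` is a crude coordinate-descent stand-in for a classical local minimizer, and the Grover step, which on a quantum computer would find a below-threshold point in roughly O(sqrt(N/t)) oracle calls, is replaced here by plain random sampling.

```python
import random

def hybrid_minimize(f, lo, hi, dim, rounds=20, seed=0):
    """Classical sketch of a hybrid scheme: local descent alternated with
    a (here, classically simulated) Grover-style search for any point
    whose value beats the current best."""
    rng = random.Random(seed)
    x = [rng.uniform(lo, hi) for _ in range(dim)]

    def local_descent(x, step=0.1, iters=200):
        # Crude coordinate descent: try +/- step moves per coordinate,
        # halving the step whenever no move improves the objective.
        best = list(x)
        for _ in range(iters):
            improved = False
            for i in range(dim):
                for d in (-step, step):
                    y = list(best)
                    y[i] = min(hi, max(lo, y[i] + d))
                    if f(y) < f(best):
                        best, improved = y, True
            if not improved:
                step /= 2
        return best

    best = local_descent(x)
    for _ in range(rounds):
        # Grover's search would locate a point below the threshold with a
        # quadratic speedup; here we simply sample classically.
        threshold = f(best)
        for _ in range(100):
            y = [rng.uniform(lo, hi) for _ in range(dim)]
            if f(y) < threshold:
                best = local_descent(y)
                break
    return best, f(best)
```

On a convex objective the escape step never fires and the sketch reduces to a single local descent; on multimodal objectives it restarts descent from any sampled point that beats the current minimum.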



Acknowledgments

The authors would like to thank FAPESP and CNPq for their financial support. R.P. would like to thank Prof. Benjamín Barán for useful suggestions.

Author information


Correspondence to Pedro C. S. Lara.

Appendices

Appendix A: Test functions

Neumaier

$$\begin{aligned} f(x_0,\ldots ,x_{n-1})=\sum \limits _{i=0}^{n-1} (x_i - 1)^2 - \sum \limits _{i=1}^{n-1} x_i x_{i-1}, \quad 0 \le x_i\le 4 \end{aligned}$$

Griewank

$$\begin{aligned} f(x_0,\ldots ,x_{n-1})=\frac{1}{4000} \sum \limits _{i = 0} ^ {n-1} x_i^2 - \prod \limits _{i=0}^{n-1} \cos \left( \frac{x_i}{\sqrt{i+1}} \right) + 1, \quad -40 \le x_i\le 40 \end{aligned}$$

Shekel

$$\begin{aligned} f(x_0,\ldots ,x_{n-1}) = \sum \limits _{i = 0}^{m-1} \frac{1}{c_{i} + \sum \limits _{j = 0}^{n-1} (x_{j} - a_{ji})^2 }, \quad -1 \le x_i\le 1 \end{aligned}$$

Rosenbrock

$$\begin{aligned} f(x_0,\ldots ,x_{n-1}) = \sum \limits _{i=0}^{n-2} (1-x_i)^2+ 100 (x_{i+1} - x_i^2 )^2, \quad -30 \le x_i\le 30 \end{aligned}$$

Michalewicz

$$\begin{aligned} f(x_0,\ldots ,x_{n-1}) =-\sum \limits _{i=0}^{n-1} \sin (x_i) \sin ^{2m}\left( \frac{ i x_i^2}{\pi }\right) , \quad 0 \le x_i\le 10 \end{aligned}$$

Dejong

$$\begin{aligned} f(x_0,\ldots ,x_{n-1}) =\sum \limits _{i=0}^{n-1}x_i^2, \quad -5.12 \le x_i\le 5.12 \end{aligned}$$

Ackley

$$\begin{aligned} f(x_0,\ldots ,x_{n-1})&= -20 \exp \left( -\frac{1}{5} \sqrt{\frac{1}{n} \sum \limits _{i=0}^{n-1} x_i^2} \; \right) -\exp \left( \frac{1}{n} \sum \limits _{i=0}^{n-1} \cos ( 2 \pi x_i) \right) \\&\quad + 20 +\exp (1), \quad -15 \le x_i\le 20 \end{aligned}$$

Schwefel

$$\begin{aligned} f(x_0,\ldots ,x_{n-1}) = -\sum \limits _{i=0}^{n-1} x_i \sin \left( \sqrt{|x_i|}\right) \!, \quad -20 \le x_i\le 20 \end{aligned}$$

Rastrigin

$$\begin{aligned} f(x_0,\ldots ,x_{n-1}) = \sum \limits _{i=0}^{n-1} \left( x_i^2 - 10\cos (2 \pi x_i) + 10 \right) , \quad -5.12 \le x_i\le 5.12 \end{aligned}$$

Raydan

$$\begin{aligned} f(x_0,\ldots ,x_{n-1}) = -\sum \limits _{i=0}^{n-1} \frac{(i+1)}{10}\left( \exp (x_i) - x_i\right) , \quad -5.12 \le x_i\le 5.12 \end{aligned}$$
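For concreteness, the test functions above translate directly into code. The Python sketch below covers a representative subset, implemented term by term from the formulas as written, with each function's known global minimum noted in its docstring.

```python
import math

def dejong(x):
    """De Jong sphere: global minimum 0 at the origin."""
    return sum(t * t for t in x)

def rastrigin(x):
    """Rastrigin: global minimum 0 at the origin, many local minima."""
    return sum(t * t - 10 * math.cos(2 * math.pi * t) + 10 for t in x)

def griewank(x):
    """Griewank: global minimum 0 at the origin."""
    s = sum(t * t for t in x) / 4000
    p = math.prod(math.cos(t / math.sqrt(i + 1)) for i, t in enumerate(x))
    return s - p + 1

def rosenbrock(x):
    """Rosenbrock: global minimum 0 at (1, ..., 1)."""
    return sum((1 - x[i]) ** 2 + 100 * (x[i + 1] - x[i] ** 2) ** 2
               for i in range(len(x) - 1))

def ackley(x):
    """Ackley: global minimum 0 at the origin."""
    n = len(x)
    a = -20 * math.exp(-0.2 * math.sqrt(sum(t * t for t in x) / n))
    b = -math.exp(sum(math.cos(2 * math.pi * t) for t in x) / n)
    return a + b + 20 + math.e
```

These implementations make the contrast among the testbed visible: De Jong and Rosenbrock have a single basin of attraction, while Rastrigin, Griewank, and Ackley are highly multimodal, which is where an escape mechanism matters.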

Appendix B: Standard deviation

This appendix presents the standard deviations of the number of evaluations for the one-, two-, and three-variable test functions associated with Tables 1, 2, and 3, respectively. In all tables, the smallest standard deviations are shown in bold and the largest in italic (Tables 6, 7, 8).

Table 6 Standard deviation for one-variable test functions
Table 7 Standard deviation for two-variable test functions
Table 8 Standard deviation for three-variable test functions

Appendix C: Success probability

This appendix presents the success probability of Algorithm 3, with the termination condition given by Eq. (2), for the one-, two-, and three-variable test functions using the classical optimization routines. In all tables, the largest probabilities are shown in bold and the smallest in italic. Each entry is an average over 100 rounds (Tables 9, 10, 11).

Table 9 Success probability for one-variable test functions
Table 10 Success probability for two-variable test functions
Table 11 Success probability for three-variable test functions
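The success probabilities in these tables are empirical frequencies over independent rounds. As a generic illustration of that estimator (the name `success_probability` and its interface are our own, not the paper's):

```python
def success_probability(run, rounds=100):
    """Estimate the success probability of a randomized routine as the
    fraction of independent rounds in which run() reports success."""
    return sum(1 for _ in range(rounds) if run()) / rounds
```

Here `run` would wrap one execution of the optimization routine and return whether it reached the global minimum within tolerance.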


Cite this article

Lara, P.C.S., Portugal, R. & Lavor, C. A new hybrid classical-quantum algorithm for continuous global optimization problems. J Glob Optim 60, 317–331 (2014). https://doi.org/10.1007/s10898-013-0112-8
