
Evolution strategies: application in hybrid quantum-classical neural networks

  • Published in: Quantum Information Processing

Abstract

With the rapid development of quantum computers, several applications have been proposed for them: quantum simulations, simulation of chemical reactions, the solution of optimization problems, and quantum neural networks (QNNs) are some examples. However, problems such as noise, the limited number of qubits and circuit depth, and vanishing gradients must be resolved before these devices can be used to their full potential. In quantum machine learning, several models have been proposed, and in general they are trained using the gradient of a cost function with respect to the model parameters. One of the most common methods in the literature for computing this gradient is the parameter-shift rule, which evaluates the cost function twice for each parameter of the QNN; as a consequence, the number of evaluations grows linearly with the number of parameters. In this work, we study an alternative method, evolution strategies (ES), a family of black-box optimization algorithms that iteratively update the parameters using a search gradient. An advantage of ES is that the number of cost-function evaluations can be controlled by the user. We apply ES to a binary classification task, showing that it is a viable alternative for training QNNs. However, we observe that its performance depends strongly on the chosen hyperparameters. Furthermore, we observe that ES, like the parameter-shift rule, suffers from the vanishing-gradient problem.
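The contrast between the two gradient estimators can be sketched on a classical stand-in for a QNN cost function, here a plain weighted sum of sinusoids, for which the parameter-shift rule is exact. All names, weights, and hyperparameter values below are illustrative, not the paper's setup; in the article itself the cost comes from measuring a parameterized quantum circuit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Classical stand-in for a QNN cost: a weighted sum of sinusoids.
# For this functional form the parameter-shift rule is exact.
weights = np.array([0.7, -1.3, 0.5, 2.0])

def cost(theta):
    return float(np.dot(weights, np.sin(theta)))

def parameter_shift_grad(f, theta, shift=np.pi / 2):
    """Gradient via the parameter-shift rule: 2 evaluations per parameter."""
    grad = np.zeros_like(theta)
    for i in range(len(theta)):
        plus, minus = theta.copy(), theta.copy()
        plus[i] += shift
        minus[i] -= shift
        grad[i] = (f(plus) - f(minus)) / 2.0
    return grad  # 2 * len(theta) cost evaluations in total

def es_grad(f, theta, sigma=0.1, population=100):
    """ES search-gradient estimate with mirrored sampling:
    2 * population evaluations, independent of len(theta)."""
    eps = rng.normal(size=(population, len(theta)))
    diffs = np.array([f(theta + sigma * e) - f(theta - sigma * e) for e in eps])
    return (diffs[:, None] * eps).mean(axis=0) / (2.0 * sigma)

theta = np.array([0.3, -1.1, 2.0, 0.5])
g_ps = parameter_shift_grad(cost, theta)  # exact here: weights * cos(theta)
g_es = es_grad(cost, theta)               # noisy estimate of the same vector
```

The parameter-shift rule always costs 2n evaluations for n parameters, while the ES estimate uses a population size fixed by the user, at the price of a noisy estimate whose quality depends on the hyperparameters sigma and population, mirroring the trade-off studied in the paper.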


Data availability

The Qiskit/PyTorch code used to implement the simulations and obtain the data in this article is available from the authors upon request.


Acknowledgements

This work was supported by the Foundation for Research Support of the State of Rio Grande do Sul (FAPERGS), by the National Institute for the Science and Technology of Quantum Information (INCT-IQ), process 465469/2014-0, and by the National Council for Scientific and Technological Development (CNPq), process 309862/2021-3.

Corresponding author

Correspondence to Jonas Maziero.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Friedrich, L., Maziero, J. Evolution strategies: application in hybrid quantum-classical neural networks. Quantum Inf Process 22, 132 (2023). https://doi.org/10.1007/s11128-023-03876-8

