Abstract
With the rapid development of quantum computers, several applications have been proposed for them, including quantum simulation, the simulation of chemical reactions, the solution of optimization problems, and quantum neural networks (QNNs). However, problems such as noise, the limited number of qubits, restricted circuit depth, and vanishing gradients must be resolved before quantum computers can be used to their full potential. In quantum machine learning, several models have been proposed; in general, these models are trained using the gradient of a cost function with respect to the model parameters. One of the most widely used methods in the literature for obtaining this gradient is the parameter-shift rule, which evaluates the cost function twice for each parameter of the QNN. Consequently, the number of evaluations grows linearly with the number of parameters. In this work, we study an alternative method, evolution strategies (ES), a family of black-box optimization algorithms that iteratively update the parameters using a search gradient. An advantage of ES is that the user can control the number of times the cost function is evaluated. We apply ES to a binary classification task, showing that it is a viable alternative for training QNNs. However, we observe that its performance depends strongly on the hyperparameters used. Furthermore, we observe that ES, like the parameter-shift rule, also suffers from the vanishing gradient problem.
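The two gradient estimators contrasted above can be sketched side by side. The snippet below is an illustration only, not the authors' implementation: it uses a toy analytic cost in place of a real QNN expectation value (the names `cost`, `parameter_shift_grad`, and `es_search_grad` are hypothetical). It shows the key trade-off from the abstract: the parameter-shift rule needs two evaluations per parameter, while the ES search gradient uses a population size chosen by the user, independent of the number of parameters.

```python
import numpy as np

def cost(theta):
    # Toy stand-in for a QNN cost; for gates generated by Pauli operators,
    # real expectation values have the same trigonometric structure.
    return np.sum(np.sin(theta))

def parameter_shift_grad(f, theta, shift=np.pi / 2):
    """Parameter-shift rule: two cost evaluations per parameter,
    i.e. 2 * len(theta) evaluations in total."""
    grad = np.empty_like(theta)
    for i in range(len(theta)):
        e = np.zeros_like(theta)
        e[i] = shift
        grad[i] = (f(theta + e) - f(theta - e)) / 2.0
    return grad

def es_search_grad(f, theta, sigma=0.1, pop=400, rng=None):
    """ES search gradient: exactly `pop` cost evaluations,
    regardless of the number of parameters."""
    if rng is None:
        rng = np.random.default_rng(0)
    eps = rng.standard_normal((pop // 2, len(theta)))
    eps = np.concatenate([eps, -eps])              # antithetic sampling
    fvals = np.array([f(theta + sigma * e) for e in eps])
    return (fvals[:, None] * eps).mean(axis=0) / sigma

theta = np.array([0.3, -1.1, 2.0, 0.7])
g_psr = parameter_shift_grad(cost, theta)  # exact for this cost: cos(theta)
g_es = es_search_grad(cost, theta)         # stochastic estimate of the same gradient
```

For this toy cost the parameter-shift result is exact, while the ES estimate fluctuates around it with a variance controlled by `sigma` and `pop`; that controllability of the evaluation budget is the advantage discussed in the text.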
Data availability
The Qiskit/Pytorch code used for implementing the simulations to obtain the data used in this article is available upon request to the authors.
Acknowledgements
This work was supported by the Foundation for Research Support of the State of Rio Grande do Sul (FAPERGS), by the National Institute for the Science and Technology of Quantum Information (INCT-IQ), process 465469/2014-0, and by the National Council for Scientific and Technological Development (CNPq), process 309862/2021-3.
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Conflict of interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Friedrich, L., Maziero, J. Evolution strategies: application in hybrid quantum-classical neural networks. Quantum Inf Process 22, 132 (2023). https://doi.org/10.1007/s11128-023-03876-8