Fully Complex-Valued Wirtinger Conjugate Neural Networks with Generalized Armijo Search

  • Bingjie Zhang
  • Junze Wang
  • Shujun Wu
  • Jian Wang
  • Huaqing Zhang (corresponding author)
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10956)


The conjugate gradient (CG) method has proven to be an effective strategy for training neural networks owing to its low memory requirements and fast convergence. In this paper, an efficient CG method is proposed for training fully complex neural networks based on Wirtinger calculus. Two techniques are adopted to enhance training performance. The first constructs a sufficient descent direction during training by designing a fine-tuned conjugate coefficient. The second pursues an optimal learning rate in each iteration, determined by a generalized Armijo search, instead of a fixed constant. To verify the effectiveness and convergence behavior of the proposed algorithm, simulations are performed on a benchmark complex noncircular signal.
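The training scheme the abstract describes — a conjugate direction update kept as a descent direction, with the step size chosen by an Armijo-type search rather than fixed — can be sketched in real-valued NumPy as follows. This is an illustrative sketch, not the paper's Wirtinger-calculus algorithm: the PR+ conjugate coefficient, the constants `sigma` and `rho`, and the restart safeguard are assumptions chosen to make the sketch self-contained.

```python
import numpy as np

def armijo_step(f, grad_f, w, d, eta0=1.0, rho=0.5, sigma=1e-4, max_halvings=30):
    # Generalized Armijo search: shrink the trial step until the
    # sufficient-decrease condition f(w + eta*d) <= f(w) + sigma*eta*(g.d) holds.
    fw, gd = f(w), grad_f(w).dot(d)
    eta = eta0
    for _ in range(max_halvings):
        if f(w + eta * d) <= fw + sigma * eta * gd:
            break
        eta *= rho
    return eta

def cg_minimize(f, grad_f, w0, iters=200):
    # Nonlinear CG with a truncated (PR+) conjugate coefficient; the
    # max(0, .) truncation plus the restart below keeps d a descent direction.
    w = np.asarray(w0, dtype=float)
    g = grad_f(w)
    d = -g
    for _ in range(iters):
        if g.dot(d) >= 0:          # safeguard: restart along steepest descent
            d = -g
        eta = armijo_step(f, grad_f, w, d)  # learning rate per iteration
        w = w + eta * d
        g_new = grad_f(w)
        beta = max(0.0, g_new.dot(g_new - g) / max(g.dot(g), 1e-12))
        d = -g_new + beta * d      # conjugate direction update
        g = g_new
    return w
```

For example, minimizing the quadratic `f(w) = (w[0]-1)**2 + 3*(w[1]+2)**2` from `w0 = [0, 0]` drives the iterate to the minimizer `[1, -2]`; in the paper, `f` would instead be the network's error function of the complex weights, with gradients taken in the Wirtinger sense.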


Keywords: Complex network, Conjugate gradient, Armijo search, Wirtinger



This work was supported in part by the National Natural Science Foundation of China (No. 61305075), the Natural Science Foundation of Shandong Province (No. ZR2015A-L014, ZR201709220208) and the Fundamental Research Funds for the Central Universities (No. 15CX08011A, 18CX02036A).


References

  1. Chen, S., Hong, X., Khalaf, E., Alsaadi, F.E., Harris, C.J.: Complex-valued B-spline neural network and its application to iterative frequency-domain decision feedback equalization for Hammerstein communication systems. In: 2016 International Joint Conference on Neural Networks, pp. 4097–4104 (2016)
  2. Liu, Y.S., Huang, H., Huang, T.W., Qian, X.S.: An improved maximum spread algorithm with application to complex-valued RBF neural networks. Neurocomputing 216, 261–267 (2016)
  3. Fink, O., Zio, E., Weidmann, U.: Predicting component reliability and level of degradation with complex-valued neural networks. Reliab. Eng. Syst. Saf. 121, 198–206 (2014)
  4. Aizenberg, I.: Complex-Valued Neural Networks with Multivalued Neurons. Springer, Heidelberg (2011)
  5. Nitta, T.: Solving the XOR problem and the detection of symmetry using a single complex-valued neuron. Neural Netw. 16, 1101–1105 (2003)
  6. Kim, T., Adali, T.: Approximation by fully complex multilayer perceptrons. Neural Comput. 15, 1641–1666 (2003)
  7. Savitha, R., Suresh, S., Sundararajan, N.: Metacognitive learning in a fully complex-valued radial basis function neural network. Neural Comput. 24, 1297–1328 (2012)
  8. Li, M., Huang, G., Saratchandran, P., Sundararajan, N.: Fully complex extreme learning machine. Neurocomputing 68, 306–314 (2005)
  9. Nitta, T.: An extension of the back-propagation algorithm to complex numbers. Neural Netw. 10, 1391–1415 (1997)
  10. Zhao, Z.Z., Xu, Q.S., Jia, M.P.: Improved shuffled frog leaping algorithm-based BP neural network and its application in bearing early fault diagnosis. Neural Comput. Appl. 27, 375–385 (2016)
  11. Xie, L.: The heat load prediction model based on BP neural network-Markov model. Procedia Comput. Sci. 107, 296–300 (2017)
  12. Li, Z.K., Zhao, X.H.: BP artificial neural network based wave front correction for sensor-less free space optics communication. Opt. Commun. 385, 219–228 (2017)
  13. Li, H., Adali, T.: Complex-valued adaptive signal processing using nonlinear functions. EURASIP J. Adv. Sig. Process. 2008, 1–9 (2008)
  14. Zhang, H.S., Liu, X.D., Xu, D.P., Zhang, Y.: Convergence analysis of fully complex backpropagation algorithm based on Wirtinger calculus. Cogn. Neurodyn. 8(3), 261–266 (2014)
  15. Xu, D.P., Zhang, H.S., Mandic, D.P.: Convergence analysis of an augmented algorithm for fully complex-valued neural networks. Neural Netw. 69, 44–50 (2015)
  16. Zhang, H.S., Xu, D.P., Zhang, Y.: Boundedness and convergence of split-complex back-propagation algorithm with momentum and penalty. Neural Process. Lett. 39(3), 297–307 (2014)
  17. Zhang, H.S., Zhang, C., Wu, W.: Convergence of batch split-complex backpropagation algorithm for complex-valued neural networks. Discrete Dyn. Nat. Soc. 1–16 (2009)
  18. Papalexopoulos, A.D., Hao, S.Y., Peng, T.M.: An implementation of a neural-network-based load forecasting model for the EMS. IEEE Trans. Power Syst. 9, 1956–1962 (1994)
  19. Lu, C.N., Wu, H.T., Vemuri, S.: Neural network based short-term load forecasting. IEEE Trans. Power Syst. 8, 336–342 (1993)
  20. Saini, L.M., Soni, M.K.: Artificial neural network-based peak load forecasting using conjugate gradient methods. IEEE Trans. Power Syst. 17, 907–912 (2002)
  21. Goodband, J.H., Haas, O.C.L., Mills, J.A.: A comparison of neural network approaches for on-line prediction in IGRT. Med. Phys. 35, 1113–1122 (2008)
  22. Hagan, M.T., Demuth, H.B., Beale, M.H.: Neural Network Design. PWS Publishing, Boston (1996)
  23. Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (2006)
  24. Hestenes, M.R., Stiefel, E.L.: Methods of Conjugate Gradients for Solving Linear Systems. National Bureau of Standards, Washington (1952)
  25. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)
  26. Liu, H., Li, X.Y.: A modified HS conjugate gradient method. In: 2011 International Conference on Multimedia Technology, pp. 5699–5702 (2011)
  27. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française d'Informatique et de Rech. Opérationnelle 16, 35–43 (1969)
  28. Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9, 94–112 (1969)
  29. Wan, Z., Hu, C., Yang, Z.: A spectral PRP conjugate gradient method for nonconvex optimization problems based on modified line search. Discrete Contin. Dyn. Syst. Ser. B 16, 1157–1169 (2017)
  30. Sun, Q.Y., Liu, X.H.: Global convergence results of a new three-term conjugate gradient method with generalized Armijo step size rule. Math. Numer. Sinica 26, 25–36 (2004)
  31. Magoulas, G.D., Vrahatis, M.N., Androulakis, G.S.: Effective backpropagation training with variable stepsize. Neural Netw. 10, 69–82 (1997)
  32. Wang, A.P., Chen, Z.: Global convergence of a modified HS conjugate gradient method under Wolfe-type line search. J. Anhui Univ. 2, 150–156 (2015)
  33. Dong, X.L., Yang, X.M., Huang, Y.Y.: Global convergence of a new conjugate gradient method with Armijo search. J. Henan Normal Univ. 43(6), 25–29 (2015)
  34. Wang, J., Zhang, B.J., Sun, Z.Q., Hao, W.X., Sun, Q.Y.: A novel conjugate gradient method with generalized Armijo search for efficient training of feedforward neural networks. Neurocomputing 275, 308–316 (2018)
  35. Needham, T.: Visual complex analysis. Am. Math. Mon. 105, 195–196 (1998)
  36. Novey, M.P.: Complex ICA using nonlinear functions. IEEE Trans. Sig. Process. 56(9), 4536–4544 (2008)
  37. Wirtinger, W.: Zur formalen Theorie der Funktionen von mehr komplexen Veränderlichen. Mathematische Annalen 97, 357–375 (1927)
  38. Kreutz-Delgado, K.: The complex gradient operator and the CR-calculus, 1–74 (2009)
  39. Mandic, D.P., Goh, S.L.: Complex Valued Nonlinear Adaptive Filters: Noncircularity, Widely Linear and Neural Models. Wiley, Hoboken (2009)
  40. Brandwood, D.H.: A complex gradient operator and its application in adaptive array theory. IEE Proc. H: Microwaves, Optics and Antennas 130, 11–16 (1983)
  41. Orozco-Henao, C., Bretas, A.S., Chouhy-Leborgne, R., Herrera-Orozco, A.R., Marín-Quintero, J.: Active distribution network fault location methodology: a minimum fault reactance and Fibonacci search approach. Electr. Power Energy Syst. 84, 232–241 (2017)
  42. Vieira, D.A.G., Lisboa, A.C.: Line search methods with guaranteed asymptotical convergence to an improving local optimum of multimodal functions. Eur. J. Oper. Res. 235, 38–46 (2014)
  43. Xia, Y., Jelfs, B., Hulle, M.M.V., Principe, J.C., Mandic, D.P.: An augmented echo state network for nonlinear adaptive filtering of complex noncircular signals. IEEE Trans. Neural Netw. 22, 74–83 (2011)

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Bingjie Zhang (1)
  • Junze Wang (2)
  • Shujun Wu (1)
  • Jian Wang (1)
  • Huaqing Zhang (1) (corresponding author)

  1. College of Science, China University of Petroleum, Qingdao, China
  2. College of Computer and Communication Engineering, China University of Petroleum, Qingdao, China
