Abstract
In recent years, matrix-valued optimization algorithms have been studied to improve the computational performance of their vector-valued counterparts. This paper presents two matrix-type projection neural networks, a continuous-time model and a discrete-time model, for solving matrix-valued optimization problems. The proposed continuous-time neural network may be viewed as a significant extension of the vector-type double projection neural network. More importantly, the proposed discrete-time projection neural network can be implemented in parallel on the matrix state space. Under pseudo-monotonicity and Lipschitz continuity conditions, both proposed matrix-type projection neural networks are guaranteed to converge globally to the optimal solution. Finally, numerical examples show that the two proposed matrix-type projection neural networks are substantially faster than the vector-type projection neural network.
This work is supported by the National Natural Science Foundation of China under Grant No. 61473330.
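The core idea of a discrete-time matrix-type projection network can be sketched as a fixed-point iteration that acts directly on the matrix state rather than on its vectorization. The example below is a minimal illustration, not the authors' exact model: it assumes a box-constrained matrix quadratic program min 0.5·tr(XᵀAX) + tr(BᵀX) subject to elementwise bounds L ≤ X ≤ U, with gradient mapping F(X) = AX + B and projection given by an elementwise clip. All problem data (A, B, L, U, step size) are invented for illustration.

```python
import numpy as np

# Hypothetical sketch of a discrete-time projection iteration on a matrix
# state X:  X_{k+1} = P_Omega(X_k - alpha * F(X_k)),
# for  min 0.5*tr(X^T A X) + tr(B^T X)  s.t.  L <= X <= U  (elementwise),
# where F(X) = A @ X + B and P_Omega is an elementwise clip onto the box.
rng = np.random.default_rng(0)
n, m = 4, 3
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # symmetric positive definite, so F is monotone
B = rng.standard_normal((n, m))
L, U = -np.ones((n, m)), np.ones((n, m))

def project(X):
    """Projection onto the box {L <= X <= U} is an elementwise clip."""
    return np.clip(X, L, U)

X = np.zeros((n, m))
alpha = 1.0 / np.linalg.norm(A, 2)   # step size tied to the Lipschitz constant of F
for _ in range(500):
    X = project(X - alpha * (A @ X + B))

# A fixed point X = P_Omega(X - alpha * F(X)) satisfies the optimality condition;
# the residual below measures how close the iterate is to such a fixed point.
residual = np.linalg.norm(X - project(X - alpha * (A @ X + B)))
```

Note that the update works on whole matrices at once (matrix products and an elementwise clip), which is what makes this formulation amenable to parallel implementation, in contrast with running a vector-type network on the vec(X) form of the same problem.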
Copyright information
© 2018 Springer Nature Switzerland AG
Cite this paper
Huang, L., Xia, Y., Zhang, S. (2018). Two Matrix-Type Projection Neural Networks for Solving Matrix-Valued Optimization Problems. In: Cheng, L., Leung, A., Ozawa, S. (eds.) Neural Information Processing. ICONIP 2018. Lecture Notes in Computer Science, vol. 11302. Springer, Cham. https://doi.org/10.1007/978-3-030-04179-3_36
DOI: https://doi.org/10.1007/978-3-030-04179-3_36
Print ISBN: 978-3-030-04178-6
Online ISBN: 978-3-030-04179-3