An analog neural network approach for the least absolute shrinkage and selection operator problem
This paper addresses analog optimization for non-differentiable functions. The Lagrange programming neural network (LPNN) approach provides a systematic way to build analog neural networks for handling constrained optimization problems. However, its drawback is that it cannot handle non-differentiable functions. In compressive sampling, one of the optimization problems is the least absolute shrinkage and selection operator (LASSO), whose constraint is non-differentiable. This paper adopts the hidden-state concept from the local competition algorithm to formulate an analog model for the LASSO problem, thereby overcoming the non-differentiability limitation of LPNN. Under some conditions, the network at equilibrium yields the optimal LASSO solution, and we prove that these equilibrium points are stable. A simulation study illustrates that the proposed analog model and the traditional digital method achieve similar mean squared error performance.
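To make the hidden-state idea concrete, the following is a minimal sketch of local-competition-algorithm (LCA) style dynamics for the LASSO objective min_x 0.5·||y − Ax||² + λ·||x||₁. The hidden state u evolves by smooth dynamics while the visible output is a soft-threshold of u, which is how the non-differentiable L1 term is sidestepped. This is an illustrative Euler simulation of the generic LCA, not the paper's exact network; the function names and parameter values (tau, dt, n_steps) are assumptions.

```python
import numpy as np

def soft_threshold(u, lam):
    """Element-wise soft threshold: the visible (output) state of the network."""
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca_lasso(A, y, lam, tau=0.05, dt=0.001, n_steps=20000):
    """Euler-integrate the LCA hidden-state dynamics
       tau * du/dt = A^T y - u - (A^T A - I) x,   x = soft_threshold(u, lam).
    At equilibrium, x satisfies the LASSO optimality (KKT) conditions."""
    m, n = A.shape
    b = A.T @ y
    G = A.T @ A - np.eye(n)          # lateral-inhibition (competition) matrix
    u = np.zeros(n)                  # hidden (internal) state
    for _ in range(n_steps):
        x = soft_threshold(u, lam)   # visible state: non-smooth map of u
        u += (dt / tau) * (b - u - G @ x)
    return soft_threshold(u, lam)

# Small synthetic compressive-sampling example (illustrative dimensions)
rng = np.random.default_rng(0)
n, m, k = 50, 25, 3                  # signal length, measurements, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true
x_hat = lca_lasso(A, y, lam=0.01)
print(np.linalg.norm(x_hat - x_true))  # reconstruction error, expected small
```

Note that only the hidden state u is integrated; the output x is obtained through the non-differentiable thresholding map, so the dynamics themselves never require a gradient of the L1 term.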
Keywords: Analog neural network · Neural dynamics · LPNN · Local competition algorithm
This work was partially supported by the Research Grants Council, Hong Kong, under Grant Number CityU 115612.
Compliance with ethical standards
Conflict of interest
The authors declare that they have no commercial or associative interest that represents a conflict of interest in connection with the submitted work.