MCP-Based Noise-Resistant Algorithm for Training RBF Networks and Selecting Centers
In hardware implementations of neural networks, imperfections such as precision errors and thermal noise are unavoidable; they can be modeled as multiplicative noise. This paper studies the problem of training radial basis function (RBF) networks and selecting centers under multiplicative noise. We devise a noise-resistant training algorithm based on the alternating direction method of multipliers (ADMM) framework and the minimax concave penalty (MCP) function. The algorithm first uses all training samples to create the RBF nodes. We then derive a training objective function that tolerates the presence of noise and add an MCP term to it. Finally, we apply the ADMM framework to minimize the modified objective function. During training, the MCP term drives some unimportant RBF weights to zero, so training and RBF node selection are performed simultaneously. We call the proposed algorithm the ADMM-MCP algorithm and present its convergence properties. Simulation results show that, under weight/node noise, the ADMM-MCP algorithm outperforms many other RBF training algorithms.
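The steps described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes a standard ADMM splitting in which the least-squares step is made noise-aware by a `diag(Phi^T Phi)` term scaled by an assumed weight-noise variance `sigma2`, and the MCP term is handled via its well-known proximal operator. All parameter names and values (`lam`, `gamma`, `rho`, `sigma2`) are illustrative.

```python
import numpy as np

def mcp_prox(v, lam, gamma, rho):
    """Elementwise proximal operator of the MCP penalty with step 1/rho.

    Requires gamma * rho > 1. Entries with |v| <= lam/rho are set to zero,
    entries with |v| > gamma*lam are left unchanged, and the middle region
    is shrunk and rescaled.
    """
    a = np.abs(v)
    mid = np.sign(v) * np.maximum(a - lam / rho, 0.0) / (1.0 - 1.0 / (gamma * rho))
    return np.where(a > gamma * lam, v, mid)

def admm_mcp_rbf(Phi, y, lam=0.5, gamma=3.0, rho=1.0, sigma2=0.01, iters=200):
    """Sketch of ADMM training of RBF output weights with an MCP sparsity term.

    Phi: N x M matrix of RBF node outputs (one node per training sample).
    sigma2: assumed multiplicative weight-noise variance; here it adds a
    diag(Phi^T Phi) term to the least-squares Hessian (an assumption for
    illustration, standing in for the paper's noise-tolerant objective).
    Returns a sparse weight vector: zero entries mark pruned RBF nodes.
    """
    M = Phi.shape[1]
    G = Phi.T @ Phi
    H = G + sigma2 * np.diag(np.diag(G)) + rho * np.eye(M)
    b = Phi.T @ y
    w = np.zeros(M); z = np.zeros(M); u = np.zeros(M)
    for _ in range(iters):
        w = np.linalg.solve(H, b + rho * (z - u))  # noise-aware least-squares step
        z = mcp_prox(w + u, lam, gamma, rho)       # MCP prox zeroes weak weights
        u = u + w - z                              # dual (scaled multiplier) update
    return z
```

In this sketch, node selection falls out of the `z`-update: nodes whose weights the MCP proximal step sets exactly to zero are removed from the network, so training and center selection happen in one pass, as the abstract describes.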
Keywords: RBF · Center selection · ADMM · MCP · Multiplicative noise
This work was supported by a research grant from City University of Hong Kong (7004842).