Abstract
This paper focuses on the generalization ability of a dendritic neuron model (a model of a simple neural network). The considered model is an extension of the Hodgkin-Huxley model. Markov kinetic schemes are used in the mathematical description of the model, while the Lagrange multipliers method is applied to train it. The generalization ability of the model is studied using a method known from regularization theory, in which a regularizer is added to the neural network error function. Three regularizers are applied in the study: the sum of squared weights of the model (the penalty function), a linear differential operator related to the input-output mapping (the Tikhonov functional), and the squared norm of the network curvature. The influence of the regularizers on the training process and its results is illustrated with the problem of noise reduction in images of electronic components. Several metrics are used to compare the results obtained for the different regularizers.
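All three regularizers share the general form E_total(w) = E(w) + λΩ(w), where E is the data-fit error and Ω penalizes model complexity. A minimal sketch of the simplest variant, the sum-of-squared-weights penalty (weight decay), is shown below; the function names, the toy data, and the value of λ are illustrative assumptions, not the paper's actual implementation:

```python
# Illustrative sketch of a weight-decay regularized error function.
# All names (mse, regularized_error) and values here are assumptions
# for illustration; the paper's model and training data differ.

def mse(predictions, targets):
    """Mean squared error over a batch: the data-fit term E(w)."""
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(targets)

def regularized_error(predictions, targets, weights, lam):
    """E_total(w) = E(w) + lambda * sum of squared weights (penalty function)."""
    penalty = sum(w ** 2 for w in weights)
    return mse(predictions, targets) + lam * penalty

# Toy example: two outputs, three weights, small regularization constant.
preds, targs = [0.9, 0.1], [1.0, 0.0]
weights = [0.5, -0.3, 0.2]
base = mse(preds, targs)                                   # 0.01
total = regularized_error(preds, targs, weights, 0.01)     # 0.01 + 0.01 * 0.38
```

Minimizing `total` instead of `base` biases training toward small weights, which smooths the learned input-output mapping; the Tikhonov and curvature regularizers mentioned above play the same role but penalize derivatives of the mapping rather than the weights themselves.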
References
Avriel, M.: Nonlinear Programming: Analysis and Methods, 2nd edn. Dover Publications, Mineola, NY (2003)
Bishop, C.: Pattern Recognition and Machine Learning. Springer, New York (2006)
Bower, J.M., Beeman, D.: The Book of GENESIS, Exploring Realistic Neural Models with the GEneral NEural SImulation System. Internet Edition (2003)
Cox, S.J., Griffith, B.E.: A fast, fully implicit backward Euler solver for dendritic neurons. Technical report, Rice University Department of Computational and Applied Mathematics (2000)
Destexhe, A., Mainen, Z.F., Sejnowski, T.J.: Synthesis of models for excitable membranes, synaptic transmission and neuromodulation using a common kinetic formalism. J. Comput. Neurosci. 1(3), 195–230 (1994)
Galicki, M., Leistritz, L., Zwick, E.B., Witte, H.: Improving generalization capabilities of dynamic neural networks. Neural Comput. 16, 1253–1282 (2004)
Haykin, S.O.: Neural Networks: A Comprehensive Foundation, 2nd edn. Pearson Education, London (1998)
Hodgkin, A.L., Huxley, A.F.: A quantitative description of membrane current and its application to conduction and excitation in nerve. J. Physiol. 117, 500–544 (1952)
Koch, C.: Biophysics of Computation: Information Processing in Single Neurons. Chapter VI. Oxford University Press, New York (1999)
Krogh, A., Hertz, J.A.: A simple weight decay can improve generalization. In: Advances in Neural Information Processing Systems 4, pp. 950–957. Morgan Kaufmann (1992)
Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer, New York (2006)
Schneidman, E., Segev, I., Tishby, N.: Information capacity and robustness of stochastic neuron models. In: Advances in Neural Information Processing Systems, vol. 12. MIT Press (2000)
Świetlicka, A.: Trained stochastic model of biological neural network used in image processing task. Appl. Math. Comput. 267, 716–726 (2015)
Świetlicka, A., Gugała, K., Jurkowlaniec, A., Śniatała, P., Rybarczyk, A.: The stochastic, Markovian, Hodgkin-Huxley type of mathematical model of the neuron. Neural Network World 25(1), 219–239 (2015)
Świetlicka, A., Gugała, K., Pedrycz, W., Rybarczyk, A.: Development of the deterministic and stochastic Markovian model of a dendritic neuron. Bioprocess Biosyst. Eng. 37, 201–216 (2017)
Świetlicka, A., Kolanowski, K., Kapela, R., Galicki, M., Rybarczyk, A.: Investigation of generalization ability of a trained stochastic kinetic model of neuron. Appl. Math. Comput. 319, 115–124 (2018)
Acknowledgements
I would like to offer my special thanks to Krzysztof Kolanowski for taking Figs. 2 and 3(A1), and to Agata Jurkowlaniec for preparing the images so that they could be used to train the biological neural network model.
Additional information
Communicated by: Pavel Solin
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
About this article
Cite this article
Świetlicka, A. Regularization theory in the study of generalization ability of a biological neural network model. Adv Comput Math 45, 1793–1805 (2019). https://doi.org/10.1007/s10444-018-09658-6
Keywords
- Kinetic model of neuron
- Markov kinetic schemes
- Lagrange multipliers
- Generalization ability
- Image processing
- Noise reduction