Abstract
Most modern methods for training artificial neural networks are based on the error backpropagation algorithm. However, it has several drawbacks: it is not biologically plausible, it requires computing and propagating the derivative of the error function, and it cannot be applied directly to binary neurons. On the other hand, algorithms based on Hebb's rule offer more biologically plausible local learning, but they are fundamentally unsupervised and cannot be applied directly to the vast range of tasks designed for supervised learning. There have been several attempts to adapt Hebb's rule to supervised learning, but these algorithms have not become widespread. We propose another hybrid method, which uses locally available information about neuron activity but also utilizes information about the error. In contrast to other methods, the presence of an error does not invert the direction of the weight change but instead affects the learning rate. We test the proposed learning method on a symmetry detection task. This task is characterized by a practically infinite number of training samples, so a method must demonstrate generalization ability to solve it. We compare the obtained results with those obtained in our previous work.
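The abstract's central idea (an error signal that scales the step size of a Hebbian update without ever flipping its sign) can be sketched as follows. This is a minimal illustration under our own assumptions, not the paper's actual rule: the function name, the two rate constants, and the binary error flag are all hypothetical.

```python
import numpy as np

def error_modulated_hebbian_update(w, x, y, error, lr_correct=0.01, lr_error=0.1):
    """Hebbian weight update whose learning rate depends on an error signal.

    The direction of the change is set purely by the local Hebbian term
    (outer product of post- and pre-synaptic activity); the error signal
    only selects how large the step is. Names and constants here are
    illustrative assumptions, not taken from the paper.
    """
    lr = lr_error if error else lr_correct  # error changes the rate, never the sign
    return w + lr * np.outer(y, x)          # classic Hebbian term: post * pre
```

Note that with this rule, a trial with an error and a trial without one move the weights in the same direction for the same activity pattern; only the magnitudes differ.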
Acknowledgements
The work was financially supported by the State Program of SRISA RAS No. 0065-2019-0003 (AAA-A19-119011590090-2).
Copyright information
© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Lebedev, A., Dorofeev, V., Shakirov, V. (2021). Symmetry Learning Using Non-traditional Biologically Plausible Learning Method. In: Kryzhanovsky, B., Dunin-Barkowski, W., Redko, V., Tiumentsev, Y. (eds) Advances in Neural Computation, Machine Learning, and Cognitive Research IV. NEUROINFORMATICS 2020. Studies in Computational Intelligence, vol 925. Springer, Cham. https://doi.org/10.1007/978-3-030-60577-3_13
Print ISBN: 978-3-030-60576-6
Online ISBN: 978-3-030-60577-3