Symmetry Learning Using Non-traditional Biologically Plausible Learning Method

  • Conference paper
Advances in Neural Computation, Machine Learning, and Cognitive Research IV (NEUROINFORMATICS 2020)

Part of the book series: Studies in Computational Intelligence (SCI, volume 925)

Abstract

Most modern methods for training artificial neural networks are based on the error backpropagation algorithm. However, it has several drawbacks: it is not biologically plausible, it requires computing and propagating the derivative of an error function, and it cannot be applied directly to binary neurons. Algorithms based on Hebb's rule, by contrast, offer more biologically plausible local learning, but they are essentially unsupervised and cannot be applied directly to the vast number of tasks designed for supervised learning. There have been several attempts to adapt Hebb's rule for supervised learning, but these algorithms have not become widespread. We propose another hybrid method that uses locally available information about neuron activity but also exploits information about the error. In contrast to other methods, the presence of an error does not invert the direction of the weight change but instead affects the learning rate. We test the proposed learning method on a symmetry detection task. This task has a practically infinite number of training samples, so a method must demonstrate generalization ability to solve it. We compare the results obtained with those of our previous work.
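
The abstract describes the learning rule only qualitatively, so the sketch below is our illustration of the stated idea, not the authors' implementation. The function names, the bipolar coding, the single-threshold-neuron setup, and both learning-rate values are assumptions; the one property taken from the text is that an erroneous output changes the size of the Hebbian step, never its sign.

```python
import numpy as np

rng = np.random.default_rng(0)

def symmetry_sample(n_bits, symmetric, rng):
    """Build a binary vector that is (or is not) mirror-symmetric.

    Assumes n_bits is even. A non-symmetric sample is made by flipping
    exactly one bit of a symmetric vector, which always breaks the mirror.
    """
    half = rng.integers(0, 2, size=n_bits // 2)
    x = np.concatenate([half, half[::-1]])
    if not symmetric:
        i = n_bits // 2 + rng.integers(0, n_bits // 2)
        x[i] ^= 1
    return x

def error_modulated_hebb(w, x, y, target, eta_ok=0.01, eta_err=0.1):
    """Hebbian step whose rate, not direction, depends on the error.

    The update is always eta * post * pre (plain Hebb's rule); a wrong
    output only selects a different eta. Both eta values are guesses,
    since the abstract gives no numbers.
    """
    eta = eta_ok if y == target else eta_err
    return w + eta * y * x

# Toy loop: one binary threshold neuron in {-1, +1} coding.
n_bits = 10
w = rng.normal(0.0, 0.1, size=n_bits)
for _ in range(1000):
    sym = bool(rng.integers(0, 2))
    x = 2 * symmetry_sample(n_bits, sym, rng) - 1  # {0,1} -> {-1,+1}
    target = 1 if sym else -1
    y = 1 if w @ x >= 0 else -1                    # neuron's binary output
    w = error_modulated_hebb(w, x, y, target)
```

A single neuron is used here only to keep the update mechanics visible; symmetry detection is not linearly separable, so solving the actual task requires hidden units, to which the same local rule would be applied.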

Acknowledgements

The work was financially supported by the State Program of SRISA RAS No. 0065-2019-0003 (AAA-A19-119011590090-2).

Author information

Corresponding author

Correspondence to Alexander Lebedev.

Copyright information

© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG

Cite this paper

Lebedev, A., Dorofeev, V., Shakirov, V. (2021). Symmetry Learning Using Non-traditional Biologically Plausible Learning Method. In: Kryzhanovsky, B., Dunin-Barkowski, W., Redko, V., Tiumentsev, Y. (eds) Advances in Neural Computation, Machine Learning, and Cognitive Research IV. NEUROINFORMATICS 2020. Studies in Computational Intelligence, vol 925. Springer, Cham. https://doi.org/10.1007/978-3-030-60577-3_13
