
An Experimental Study on the Relationships Among Neural Codes and the Computational Properties of Neural Networks

  • Sergio Miguel-Tomé
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11487)

Abstract

Biological neural networks (BNNs) have inspired the creation of artificial neural networks (ANNs) [19]. One of the properties of BNNs is computational robustness, but this property is often overlooked in computer science because ANNs are usually virtualizations executed on a physical machine that lacks computational robustness. However, it was recently proposed that computational robustness could be a key feature driving the selection of the computational model in the evolution of animals [20]. Until now, only energetic cost and processing time had been considered as the features that drove the evolution of the nervous system. This new standpoint leads us to consider whether computational robustness could have driven, through natural selection, the evolution of not only the computational model but also other traits of the animal nervous system. Because an important feature of an animal's nervous system is its neural code, we tested the relationships between the computational properties of feed-forward neural networks and their neural codes through in silico experiments. We found two main results: the number of epochs needed to train a feed-forward neural network using back-propagation depends on the network's neural code, and the computational robustness of a feed-forward neural network also depends on its neural code. The first result is important for ANNs and the second for BNNs.
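The kind of experiment summarized above can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the paper's actual code, network sizes, or datasets): the same small feed-forward network is trained by plain back-propagation on the same four-class task, with the targets expressed in two different output codes (one-hot vs. dense binary), and the number of epochs needed to reach a fixed error threshold is compared across codes.

```python
import numpy as np

# Inputs: one-hot identity of four "stimuli" (a toy task, chosen for brevity)
X = np.eye(4)

# Two candidate neural codes for the same four classes
codes = {
    "one-hot": np.eye(4),
    "binary": np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float),
}

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def epochs_to_threshold(Y, hidden=8, lr=1.0, threshold=0.01, max_epochs=50000):
    """Train a one-hidden-layer network with back-propagation and return
    the number of epochs until the mean squared error drops below threshold."""
    rng = np.random.default_rng(0)           # fixed seed for reproducibility
    W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden))
    W2 = rng.normal(0.0, 0.5, (hidden, Y.shape[1]))
    for epoch in range(1, max_epochs + 1):
        H = sigmoid(X @ W1)                  # forward pass: hidden layer
        O = sigmoid(H @ W2)                  # forward pass: output layer
        err = O - Y
        if np.mean(err ** 2) < threshold:
            return epoch
        dO = err * O * (1 - O)               # backward pass: output deltas
        dH = (dO @ W2.T) * H * (1 - H)       # backward pass: hidden deltas
        W2 -= lr * H.T @ dO                  # full-batch gradient-descent step
        W1 -= lr * X.T @ dH
    return max_epochs

for name, Y in codes.items():
    print(name, epochs_to_threshold(Y))
```

Comparing the two printed epoch counts for the same architecture and seed gives a crude version of the paper's first measurement; robustness could be probed analogously by perturbing trained weights and measuring how decoding accuracy degrades under each code.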

Keywords

Computational robustness · Neural code · Neural network · Back-propagation

References

  1. Aggarwal, C.: Neural Networks and Deep Learning: A Textbook. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-94463-0
  2. Albert, R., et al.: Error and attack tolerance of complex networks. Nature 406, 378–382 (2000)
  3. Antonopoulos, C., et al.: Evaluating performance of neural codes in model neural communication networks. Neural Netw. 109, 90–102 (2019)
  4. Beaulieu-Laroche, L., et al.: Enhanced dendritic compartmentalization in human cortical neurons. Cell 175(3), 643–651 (2018)
  5. Bullmore, E., Sporns, O.: The economy of brain network organization. Nat. Rev. Neurosci. 12, 336–349 (2012)
  6. Cherniak, C.: Component placement optimization in the brain. J. Neurosci. 14(4), 2418–2427 (1994)
  7. Ghosh, A., Pal, N., Pal, S.: Modeling of component failure in neural networks for robustness evaluation: an application to object extraction. IEEE Trans. Neural Netw. 6(3), 648–656 (1995)
  8. Ghosh, A., Tanaka, H.: On making neural network based learning systems robust. IETE J. Res. 44(4–5), 219–225 (1998)
  9. Guerguiev, J., et al.: Towards deep learning with segregated dendrites. eLife 6, e22901 (2017)
  10. Gulyás, A., et al.: Navigable networks as Nash equilibria of navigation games. Nat. Commun. 6(7651), 1–10 (2015)
  11. Kalampokis, A., et al.: Robustness in biological neural networks. Physica A: Stat. Mech. Appl. 317(3–4), 581–590 (2003)
  12. Kazantsev, V.B., et al.: Self-referential phase reset based on inferior olive oscillator dynamics. Proc. Nat. Acad. Sci. 101(52), 18183–18188 (2004)
  13. Kong, Q., et al.: Efficient coding matters in the organization of the early visual system. Neural Netw. 105, 218–226 (2018)
  14. Laughlin, S.B., Sejnowski, T.J.: Communication in neural networks. Science 301(5641), 1870–1874 (2003)
  15. Lianchun, Y., Yuguo, Y.: Energy-efficient neural information processing in individual neurons and neuronal networks. J. Neurosci. Res. 95(11), 2253–2266 (2017)
  16. Lucal, H.M.: Arithmetic operations for digital computers using a modified reflected binary code. IRE Trans. Electron. Comput. EC-8(4), 449–458 (1959)
  17. Makarenko, V., Llinás, R.: Experimentally determined chaotic phase synchronization in a neuronal system. Proc. Nat. Acad. Sci. 95(26), 15747–15752 (1998)
  18. Manin, Y.I.: Error-correcting codes and neural networks. Sel. Math. 24(1), 521–530 (2018)
  19. McCulloch, W., Pitts, W.: A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biophys. 5, 115–133 (1943)
  20. Miguel-Tomé, S.: The influence of computational traits on the natural selection of the nervous system. Nat. Comput. 17(2), 403–425 (2018)
  21. Moreno, H., et al.: Synaptic transmission block by presynaptic injection of oligomeric amyloid beta. Proc. Nat. Acad. Sci. 106(14), 5901–5906 (2009)
  22. Pryluk, R., et al.: A tradeoff in the neural code across regions and species. Cell 176(3), 597–609.e18 (2019)
  23. Rumelhart, D.E., Hinton, G.E., Williams, R.J.: Learning internal representations by error propagation. In: Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol. 1, pp. 318–362. MIT Press (1986)
  24. Werbos, P.: Beyond regression: new tools for prediction and analysis in the behavioral sciences. Ph.D. thesis, Harvard University (1974)
  25. Yeung, R.: Information Theory and Network Coding. Springer, Boston (2008). https://doi.org/10.1007/978-0-387-79234-7
  26. Yuste, R.: From the neuron doctrine to neural networks. Nat. Rev. Neurosci. 16, 487–497 (2015)

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. CITAI (Cluster de Investigación en Tecnologías Aplicadas a la Innovación), Universidad Isabel I, Burgos, Spain
