
Population Coding Can Greatly Improve Performance of Neural Networks: A Comparison

  • Conference paper
  • First Online:
Artificial Neural Networks and Machine Learning – ICANN 2023

Abstract

Artificial neural networks often operate on continuous inputs. While biological neural networks usually represent information through the activity of a population of neurons, the inputs of an artificial neural network are typically provided as a list of scalars. Since the information content of each input scalar depends heavily on the problem domain, representing every input as a single scalar, irrespective of how much information it carries, may be suboptimal for the network. We therefore compare and examine four different Population Coding schemes and demonstrate on two toy datasets and one real-world benchmark that applying Population Coding to information-rich, low-dimensional inputs can vastly improve a network’s performance.

M. Jahrens and H.-O. Hansen contributed equally.
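As a minimal sketch of what population coding of a scalar input can look like: one common scheme from the neuroscience literature (Gaussian tuning curves) expands each scalar into the activities of several neurons with overlapping receptive fields. The paper compares four schemes whose details are not reproduced here, so the function below is an illustrative assumption, not the authors' implementation.

```python
import numpy as np

def gaussian_population_code(x, n_neurons=16, x_min=0.0, x_max=1.0):
    """Expand a scalar x into the activities of n_neurons units whose
    Gaussian tuning curves tile the interval [x_min, x_max].

    Illustrative sketch only; the schemes compared in the paper may differ.
    """
    centers = np.linspace(x_min, x_max, n_neurons)
    # Width chosen so neighboring tuning curves overlap (an assumption).
    sigma = (x_max - x_min) / (n_neurons - 1)
    return np.exp(-0.5 * ((x - centers) / sigma) ** 2)

# A single scalar becomes a 16-dimensional input vector for the network.
print(gaussian_population_code(0.37).round(2))
```

Feeding this vector to the first layer in place of the raw scalar gives the network a smooth, distributed representation of the input, which is the effect the abstract attributes to population coding.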



Author information


Corresponding author

Correspondence to Hans-Oliver Hansen.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Jahrens, M., Hansen, H.-O., Köhler, R., Martinetz, T. (2023). Population Coding Can Greatly Improve Performance of Neural Networks: A Comparison. In: Iliadis, L., Papaleonidas, A., Angelov, P., Jayne, C. (eds.) Artificial Neural Networks and Machine Learning – ICANN 2023. Lecture Notes in Computer Science, vol. 14258. Springer, Cham. https://doi.org/10.1007/978-3-031-44192-9_31


  • DOI: https://doi.org/10.1007/978-3-031-44192-9_31

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-44191-2

  • Online ISBN: 978-3-031-44192-9

  • eBook Packages: Computer Science, Computer Science (R0)
