
Neurodynamical Model of the Visual Recognition of Dynamic Bodily Actions from Silhouettes

  • Conference paper
  • First Online:
Artificial Neural Networks and Machine Learning – ICANN 2023 (ICANN 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14255)


Abstract

For social species, including primates, the recognition of dynamic body actions is crucial for survival. However, the detailed neural circuitry underlying this process is currently not well understood. In monkeys, body-selective patches in the visual temporal cortex may contribute to this processing. We propose a physiologically inspired neural model of the visual recognition of body movements, which combines an existing image-computable model ('ShapeComp'), which produces high-dimensional shape vectors of object silhouettes, with a neurodynamical model that encodes dynamic image sequences using sequence-selective neural fields. The model successfully classifies videos of body silhouettes performing different actions. At the population level, the model reproduces characteristics of macaque single-unit responses from the rostral dorsal bank of the Superior Temporal Sulcus (Anterior Medial Upper Body (AMUB) patch). In the presence of time gaps in the stimulus videos, the predictions made by the model match the data from real neurons. The underlying neurodynamics can be analyzed by exploiting the framework of neural field dynamics.
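The abstract describes a combination of frame-wise shape encoding with a sequence-selective neural field. As an illustration only, the following Python sketch shows one standard way such sequence selectivity can arise: an Amari-style field whose lateral interaction kernel is shifted along the learned frame order, driven by frame-wise similarity inputs. All function names, parameters, and the toy stimulus below are assumptions made for illustration; they do not reproduce the paper's ShapeComp features or fitted model.

```python
import numpy as np

# Minimal sketch of a sequence-selective neural field (Amari-type dynamics),
# assuming a 1D field over "keyframe" positions of one action template.
# All names and parameters are illustrative, not the authors' values.

def asymmetric_kernel(n, sigma=2.0, shift=1.5, inhibition=0.2):
    """Lateral interaction kernel shifted along the learned frame order,
    so activity is supported only when frames arrive in that order."""
    x = np.arange(n)
    dist = x[:, None] - x[None, :]
    return np.exp(-(dist - shift) ** 2 / (2 * sigma ** 2)) - inhibition

def simulate_field(inputs, tau=5.0, dt=1.0, h=-1.0):
    """Integrate du/dt = (-u + w @ f(u) + s(t) + h) / tau over the stimulus.
    `inputs` has shape (T, n): frame-wise similarity of the stimulus to the
    n stored keyframes (e.g. derived from ShapeComp-like shape vectors)."""
    T, n = inputs.shape
    w = asymmetric_kernel(n)
    u = np.zeros(n)
    f = lambda v: 1.0 / (1.0 + np.exp(-4.0 * v))   # smooth threshold nonlinearity
    activity = np.zeros((T, n))
    for t in range(T):
        u += dt / tau * (-u + w @ f(u) + inputs[t] + h)
        activity[t] = f(u)
    return activity

# Toy usage: keyframes presented in the trained order drive a travelling
# activity pulse; the same frames played in reverse order do not.
n_frames, T = 20, 100
forward = np.array([np.exp(-(np.arange(n_frames) - (t / T) * n_frames) ** 2 / 2.0)
                    for t in range(T)])
act_fwd = simulate_field(forward)
act_rev = simulate_field(forward[::-1])
print("summed response, forward vs. reversed:",
      act_fwd.sum().round(1), act_rev.sum().round(1))
```

In this toy setting, a stimulus whose keyframes arrive in the trained order produces stronger summed field activity than the same frames in reverse, which is the qualitative signature of the sequence selectivity described in the abstract.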



Acknowledgements

This work was supported by ERC 2019-SyG-RELEVANCE-856495; SSTeP-KiZ BMG:ZMWI1-2520DAT700.

Author information


Corresponding author

Correspondence to Prerana Kumar.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Kumar, P. et al. (2023). Neurodynamical Model of the Visual Recognition of Dynamic Bodily Actions from Silhouettes. In: Iliadis, L., Papaleonidas, A., Angelov, P., Jayne, C. (eds) Artificial Neural Networks and Machine Learning – ICANN 2023. ICANN 2023. Lecture Notes in Computer Science, vol 14255. Springer, Cham. https://doi.org/10.1007/978-3-031-44210-0_43


  • DOI: https://doi.org/10.1007/978-3-031-44210-0_43

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-44209-4

  • Online ISBN: 978-3-031-44210-0

  • eBook Packages: Computer Science, Computer Science (R0)
