Bio-Inspired Optic Flow from Event-Based Neuromorphic Sensor Input

  • Stephan Tschechne
  • Roman Sailer
  • Heiko Neumann
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8774)


Computational models of visual processing often use frame-based image acquisition techniques to process a temporally changing stimulus. This approach is unlike biological mechanisms, which are spike-based and independent of individual frames. The neuromorphic Dynamic Vision Sensor (DVS) [Lichtsteiner et al., 2008] provides a stream of independent visual events that indicate local illumination changes, resembling the spiking output of retinal neurons. We introduce a new approach to modelling cortical mechanisms of motion detection along the dorsal pathway using this type of representation. Our model combines filters with spatio-temporal tunings also found in visual cortex to yield spatio-temporal frequency and direction selectivity. We probe our model with recordings of test stimuli, articulated motion, and ego-motion. We show that our approach robustly estimates optic flow and demonstrate how this output can be used for classification purposes.
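The core idea of direction selectivity from spatio-temporal filtering can be illustrated with a toy sketch. The snippet below is not the authors' model: it simulates a DVS-like event stream for an edge moving rightward, bins the events into a space-time volume, and correlates its x-t projection with speed-tuned filters (a crude stand-in for the cortical motion-energy filters of [Adelson and Bergen, 1985]). All function names and parameters are illustrative assumptions.

```python
import numpy as np

def make_events(width=32, height=32, steps=16):
    """Synthetic DVS-like events (x, y, t, polarity): a vertical edge
    sweeping rightward, one column of ON events per time step.
    Toy stand-in for real sensor output."""
    events = []
    for t in range(steps):
        x = 8 + t  # edge moves right by 1 px per step
        for y in range(height):
            events.append((x, y, t, +1))
    return events

def event_volume(events, width=32, height=32, steps=16):
    """Bin events into a (t, y, x) spatio-temporal volume."""
    vol = np.zeros((steps, height, width))
    for x, y, t, p in events:
        vol[t, y, x] += p
    return vol

def st_filter(steps, width, speed):
    """Space-time-oriented filter in the x-t plane: a Gaussian ridge
    along the trajectory x = width/2 + speed*(t - steps/2), crudely
    approximating a direction-tuned spatio-temporal receptive field."""
    f = np.zeros((steps, width))
    xs = np.arange(width)
    for t in range(steps):
        center = width / 2 + speed * (t - steps / 2)
        f[t] = np.exp(-0.5 * ((xs - center) / 1.5) ** 2)
    return f / f.sum()

def direction_energy(vol, speed):
    """Correlate the x-t projection of the event volume with a filter
    tuned to the given speed; a larger response indicates a better
    match between stimulus motion and filter tuning."""
    xt = vol.sum(axis=1)  # collapse y: (t, x) slice
    f = st_filter(*xt.shape, speed)
    return float((xt * f).sum())

vol = event_volume(make_events())
rightward = direction_energy(vol, +1.0)
leftward = direction_energy(vol, -1.0)
# The rightward-tuned filter responds more strongly to rightward motion.
```

Comparing responses across a bank of such filters (over many speeds and orientations) yields a population code from which local flow direction and speed can be read out, which is the spirit of the approach described above.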


Keywords: Event vision · Optic flow · Neural model · Classification


References

  1. [Adelson and Bergen, 1991]
    Adelson, E.H., Bergen, J.R.: The plenoptic function and the elements of early vision. In: Landy, M., Movshon, J.A. (eds.) Computational Models of Visual Processing, pp. 3–20. MIT Press (1991)
  2. [Adelson and Bergen, 1985]
    Adelson, E.H., Bergen, J.R.: Spatiotemporal energy models for the perception of motion. J. Opt. Soc. Am. A 2(2), 284–299 (1985)
  3. [Bayerl and Neumann, 2004]
    Bayerl, P., Neumann, H.: Disambiguating visual motion through contextual feedback modulation. Neural Computation 16(10), 2041–2066 (2004)
  4. [Benosman et al., 2014]
    Benosman, R., Clercq, C., Lagorce, X., Ieng, S.H., Bartolozzi, C.: Event-based visual flow. IEEE Trans. on Neural Networks and Learning Systems 25(2), 407–417 (2014)
  5. [Benosman et al., 2012]
    Benosman, R., Ieng, S.-H., Clercq, C., Bartolozzi, C., Srinivasan, M.: Asynchronous frameless event-based optical flow. Neural Networks 27, 32–37 (2012)
  6. [Brox et al., 2004]
    Brox, T., Bruhn, A., Papenberg, N., Weickert, J.: High accuracy optical flow estimation based on a theory for warping. In: Pajdla, T., Matas, J. (eds.) ECCV 2004. LNCS, vol. 3024, pp. 25–36. Springer, Heidelberg (2004)
  7. [De Valois et al., 2000]
    De Valois, R., Cottaris, N.P., Mahon, L.E., Elfar, S.D., Wilson, J.A.: Spatial and temporal receptive fields of geniculate and cortical cells and directional selectivity. Vision Research 40, 3685–3702 (2000)
  8. [Delbrück, 2012]
    Delbrück, T.: Fun with asynchronous vision sensors and processing. In: Fusiello, A., Murino, V., Cucchiara, R. (eds.) ECCV 2012 Ws/Demos, Part I. LNCS, vol. 7583, pp. 506–515. Springer, Heidelberg (2012)
  9. [Drazen et al., 2011]
    Drazen, D., Lichtsteiner, P., Häfliger, P., Delbrück, T., Jensen, A.: Toward real-time particle tracking using an event-based dynamic vision sensor. Exp. Fluids 51, 1465–1469 (2011)
  10. [Escobar and Kornprobst, 2012]
    Escobar, M.J., Kornprobst, P.: Action recognition via bio-inspired features: The richness of center-surround interaction. Computer Vision and Image Understanding 116(5), 593–605 (2012)
  11. [Fu et al., 2008]
    Fu, Z., Delbrück, T., Lichtsteiner, P., Culurciello, E.: An address-event fall detector for assisted living applications. IEEE Trans. on Biomedical Circuits and Systems 2(2), 88–96 (2008)
  12. [Hassenstein and Reichardt, 1956]
    Hassenstein, B., Reichardt, W.: Functional structure of a mechanism of perception of optical movement. In: Proc. 1st Intl. Congress on Cybernetics, pp. 797–801 (1956)
  13. [Lichtsteiner et al., 2008]
    Lichtsteiner, P., Posch, C., Delbrück, T.: A 128×128 120 dB 15 μs latency asynchronous temporal contrast vision sensor. IEEE Journal of Solid-State Circuits 43(2), 566–576 (2008)
  14. [Liu and Delbrück, 2010]
    Liu, S.C., Delbrück, T.: Neuromorphic sensory systems. Current Opinion in Neurobiology 20, 288–295 (2010)
  15. [Lucas and Kanade, 1981]
    Lucas, B., Kanade, T.: An iterative image registration technique with an application to stereo vision. In: Proceedings of the 7th International Joint Conference on Artificial Intelligence, IJCAI, pp. 674–679 (1981)
  16. [Rogister et al., 2011]
    Rogister, P., Benosman, R., Ieng, S.H., Posch, C.: Asynchronous event-based binocular stereo matching. IEEE Transactions on Neural Networks 22(11), 1723–1734 (2011)
  17. [Ungerleider and Haxby, 1994]
    Ungerleider, L.G., Haxby, J.V.: ‘What’ and ‘where’ in the human brain. Current Opinion in Neurobiology 4(2), 157–165 (1994)

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Stephan Tschechne (1)
  • Roman Sailer (1)
  • Heiko Neumann (1)
  1. Institute for Neural Information Processing, Ulm University, Ulm, Germany
