A Biomimetic Frame-Free Event-Driven Image Sensor

  • Christoph Posch


Conventional image sensors acquire visual information time-quantized at a predetermined frame rate. Each frame carries the information from all pixels, regardless of whether or not that information has changed since the last frame was acquired. If future artificial vision systems are to succeed in demanding applications such as autonomous robot navigation, high-speed motor control, and visual feedback loops, they must exploit the power of the biological, asynchronous, frame-free approach to vision and leave behind the unnatural limitation of frames: these vision systems must be driven and controlled by events happening within the scene in view, not by artificially created timing and control signals that bear no relation to the source of the visual information, the world. Translating the frameless paradigm of biological vision to artificial imaging systems implies that control over visual information acquisition is no longer imposed externally on an array of pixels; instead, the decision making is transferred to the individual pixel, which handles its own information autonomously. The notion of a frame then disappears completely and is replaced by a spatio-temporal volume of luminance-driven, asynchronous events. ATIS is the first optical sensor to combine several functionalities of the biological ‘where’ and ‘what’ systems of the human visual system. Following its biological role model, the sensor processes visual information in a massively parallel fashion using energy-efficient, asynchronous, event-driven methods.
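The pixel-autonomous, change-driven acquisition described above can be sketched in software. The model below is a minimal illustration, not the ATIS circuit: each pixel keeps its own log-intensity reference and emits an asynchronous event `(t, x, y, polarity)` whenever the logarithmic intensity deviates from that reference by more than a contrast threshold, then resets the reference. The tuple format, the threshold value, and the function name are illustrative assumptions.

```python
import math

def temporal_contrast_events(frames, threshold=0.15):
    """Illustrative software model of change-driven, frame-free pixels.

    Each pixel holds its own log-intensity reference and emits an event
    (t, x, y, polarity) when the log intensity deviates from that
    reference by more than `threshold`; the pixel then updates its
    reference. `frames` merely samples the scene for the simulation --
    the modeled sensor itself has no global frame clock.
    """
    h, w = len(frames[0]), len(frames[0][0])
    # Initialize each pixel's private reference from the first sample.
    ref = [[math.log(frames[0][y][x]) for x in range(w)] for y in range(h)]
    events = []
    for t, frame in enumerate(frames[1:], start=1):
        for y in range(h):
            for x in range(w):
                diff = math.log(frame[y][x]) - ref[y][x]
                if abs(diff) > threshold:
                    # ON event (+1) for brightening, OFF event (-1) for dimming.
                    events.append((t, x, y, 1 if diff > 0 else -1))
                    ref[y][x] = math.log(frame[y][x])  # pixel resets itself
    return events

# A 1x2 scene in which only one pixel brightens: only that pixel fires,
# and the static pixel contributes no data at all.
frames = [[[100, 100]], [[100, 140]], [[100, 140]]]
print(temporal_contrast_events(frames))
```

Note that redundant (unchanged) pixel values generate no output whatsoever, which is precisely the video-compression property of the event-driven approach: the output data volume scales with scene activity, not with array size times frame rate.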


Keywords: Sense node, Image sensor, Pulse width modulation, Video compression, Pulse frequency modulation



Copyright information

© Springer Science+Business Media, LLC 2011

Authors and Affiliations

  1. AIT Austrian Institute of Technology, Vienna, Austria
