
Neuromorphic Approach Sensitivity Cell Modeling and FPGA Implementation

  • Hongjie Liu
  • Antonio Rios-Navarro
  • Diederik Paul Moeys
  • Tobi Delbruck
  • Alejandro Linares-Barranco
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10613)

Abstract

Neuromorphic engineering takes inspiration from biology to solve engineering problems using the organizing principles of biological neural computation. The field has demonstrated success in sensor-based applications (vision and audition) as well as in cognition and actuation. This paper focuses on mimicking an interesting functionality of the retina computed by one type of Retinal Ganglion Cell (RGC): the early detection of approaching (expanding) dark objects. The paper presents the software model and the FPGA hardware logic implementation of this approach sensitivity cell, which can be used as an attention mechanism in later cognition layers. The input of the modeled cell comes from an asynchronous spiking Dynamic Vision Sensor (DVS), yielding an end-to-end event-based processing system. The software model was developed in Java and runs with an average processing time of 370 ns per event on a NUC embedded computer. The output firing rate for an approaching object depends on the cell parameters that set the number of input events needed to reach the firing threshold. For the hardware implementation on a Spartan-6 FPGA, the processing time is reduced to 160 ns per event with the clock running at 50 MHz.
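To make the firing-threshold mechanism described in the abstract concrete, the following Java sketch accumulates DVS events toward a configurable threshold and emits an output spike when the threshold is crossed. It is a minimal illustration under stated assumptions, not the paper's jAER or FPGA implementation: the class and field names (ApproachCellSketch, PolarityEvent, leakPerEvent) and the choice to let OFF (darkening) events excite and ON events inhibit the cell are assumptions made for this example.

// Minimal sketch of a threshold-firing approach-sensitivity cell driven by
// DVS address events. Names and the ON/OFF weighting are illustrative
// assumptions, not the paper's implementation.
public class ApproachCellSketch {

    /** One DVS address event: pixel coordinates, polarity, timestamp. */
    public static final class PolarityEvent {
        final int x, y;
        final boolean off;      // true for OFF (darkening) events
        final long timestampUs;
        PolarityEvent(int x, int y, boolean off, long timestampUs) {
            this.x = x; this.y = y; this.off = off; this.timestampUs = timestampUs;
        }
    }

    private float membrane = 0f;          // accumulated activity
    private final float firingThreshold;  // events needed to fire (cell parameter)
    private final float leakPerEvent;     // passive decay applied per event

    public ApproachCellSketch(float firingThreshold, float leakPerEvent) {
        this.firingThreshold = firingThreshold;
        this.leakPerEvent = leakPerEvent;
    }

    /** Process one event; returns true when the cell fires (threshold crossed). */
    public boolean onEvent(PolarityEvent e) {
        // Assumption: OFF events (expanding dark edge) excite, ON events inhibit.
        membrane += e.off ? 1f : -1f;
        membrane = Math.max(0f, membrane - leakPerEvent);
        if (membrane >= firingThreshold) {
            membrane = 0f;   // reset after firing
            return true;
        }
        return false;
    }
}

In this sketch the firingThreshold parameter plays the role of the cell parameter mentioned in the abstract: a lower threshold makes the cell fire earlier and more often as a dark object expands in the field of view.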

Keywords

Neuromorphic engineering · Event-based processing · Address-Event-Representation · Dynamic Vision Sensor · Approach Sensitivity Cell · Retinal Ganglion Cell

Notes

Acknowledgments

This work has been partially supported by the Spanish government grant COFNET (TEC2016-77785-P), with support from the European Regional Development Fund, and by the European Project VISUALISE (FP7-ICT-600954). We thank Prof. Francisco Gomez-Rodriguez for his support.


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Hongjie Liu (1)
  • Antonio Rios-Navarro (2)
  • Diederik Paul Moeys (1)
  • Tobi Delbruck (1)
  • Alejandro Linares-Barranco (2)
  1. Institute of Neuroinformatics, ETHZ-UZH, Zurich, Switzerland
  2. Robotic and Technology of Computers Lab, University of Seville, Sevilla, Spain
