Neuromorphic Approach Sensitivity Cell Modeling and FPGA Implementation
Neuromorphic engineering takes inspiration from biology to solve engineering problems using the organizing principles of biological neural computation. The field has demonstrated success in sensor-based applications (vision and audition) as well as in cognition and actuation. This paper focuses on mimicking an interesting functionality of the retina that is computed by one type of Retinal Ganglion Cell (RGC): the early detection of approaching (expanding) dark objects. We present a software model and a hardware (FPGA) implementation of this approach sensitivity cell, which can serve later cognition layers as an attention mechanism. The input to the modeled cell comes from an asynchronous spiking Dynamic Vision Sensor (DVS), yielding an end-to-end event-based processing system. The software model, developed in Java, achieves an average processing time of 370 ns per event on a NUC embedded computer. The output firing rate for an approaching object depends on the cell parameters that set the number of input events needed to reach the firing threshold. For the hardware implementation on a Spartan-6 FPGA with the clock running at 50 MHz, the processing time is reduced to 160 ns per event.
Keywords: Neuromorphic engineering · Event-based processing · Address-Event Representation · Dynamic Vision Sensor · Approach Sensitivity Cell · Retinal Ganglion Cell
This work has been partially supported by the Spanish government grant (with support from the European Regional Development Fund) COFNET (TEC2016-77785-P) and the European Project VISUALISE (FP7-ICT-600954). We thank Prof. Francisco Gomez-Rodriguez for his support.
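The cell's behavior described in the abstract, accumulating input events until a firing threshold is reached, can be illustrated with a minimal event-driven sketch. This is a simplified illustration, not the paper's actual model: the class name, event polarities, and weights below are assumptions, and the real RGC model operates over ON/OFF subunits of the DVS address space.

```java
// Minimal sketch of an event-driven approach-sensitivity cell.
// Hypothetical names and weights: OFF (darkening) events excite the
// cell, ON events inhibit it, and the cell spikes once the accumulated
// excitation reaches a configurable firing threshold.
public class ApproachCell {
    private final int firingThreshold; // input events needed to fire
    private double membrane = 0.0;     // accumulated excitation

    public ApproachCell(int firingThreshold) {
        this.firingThreshold = firingThreshold;
    }

    /** Process one DVS event at pixel (x, y); returns true if the
     *  cell fires on this event. */
    public boolean onEvent(int x, int y, boolean offPolarity) {
        membrane += offPolarity ? 1.0 : -0.5; // assumed synaptic weights
        if (membrane < 0) {
            membrane = 0; // excitation cannot go negative
        }
        if (membrane >= firingThreshold) {
            membrane = 0; // reset after an output spike
            return true;
        }
        return false;
    }
}
```

With a threshold of three, for example, three consecutive OFF events (as produced by an expanding dark edge) drive the cell to fire, while interleaved ON events delay firing.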