Abstract
Visual object tracking under challenging motion and lighting conditions is hindered by conventional cameras, which are prone to producing motion-blurred images. Event cameras are novel sensors well suited to performing vision tasks robustly under these conditions. However, due to the nature of their output, applying them to object detection and tracking is non-trivial. In this work, we propose a framework that takes advantage of both event cameras and off-the-shelf deep learning for object tracking. We show that reconstructing event data into intensity frames improves tracking performance in conditions under which conventional cameras fail to provide acceptable results.
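To make the core idea concrete, the sketch below shows the simplest way event data can be turned into an intensity-like frame that a standard detector can consume: accumulating events over a time window into a 2D image. This is only an illustrative assumption about the pipeline; the paper itself relies on learned reconstruction networks for higher-quality frames, and the function name `events_to_frame` and the event tuple format `(x, y, polarity)` are hypothetical.

```python
import numpy as np

def events_to_frame(events, height, width):
    """Accumulate events (x, y, polarity) into a 2D frame.

    Positive-polarity events add brightness, negative ones subtract.
    This naive accumulation stands in for the learned reconstruction
    used in the paper; it only illustrates the event-to-frame idea.
    """
    frame = np.zeros((height, width), dtype=np.float32)
    for x, y, polarity in events:
        frame[y, x] += 1.0 if polarity else -1.0
    # Normalize to [0, 1] so the frame can feed an off-the-shelf detector.
    lo, hi = frame.min(), frame.max()
    if hi > lo:
        frame = (frame - lo) / (hi - lo)
    return frame

# Example: three events on a 4x4 sensor; two positive at (x=1, y=2),
# one negative at (x=3, y=0).
frame = events_to_frame([(1, 2, 1), (1, 2, 1), (3, 0, 0)], 4, 4)
print(frame[2, 1], frame[0, 3])  # → 1.0 0.0 (brightest and darkest pixels)
```

The resulting frame can then be passed directly to a conventional detector and tracker, which is the motivation for reconstructing intensity images rather than processing raw event streams.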
Acknowledgments
This work has been supported by the Spanish projects PID2021-124137OB-I00 and PGC2018-098719-B-I00 (MCIU / AEI / FEDER, UE), by the Gobierno de Aragón under Project DGA T45-20R, by the Universidad de Zaragoza and Banco Santander, by the Consejo Nacional de Ciencia y Tecnología (CONACYT-Mexico), and by Spanish grant FPU20/03134.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Perez-Salesa, I., Aldana-López, R., Sagüés, C. (2023). Event-Based Visual Tracking in Dynamic Environments. In: Tardioli, D., Matellán, V., Heredia, G., Silva, M.F., Marques, L. (eds) ROBOT2022: Fifth Iberian Robotics Conference. ROBOT 2022. Lecture Notes in Networks and Systems, vol 589. Springer, Cham. https://doi.org/10.1007/978-3-031-21065-5_15
DOI: https://doi.org/10.1007/978-3-031-21065-5_15
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-21064-8
Online ISBN: 978-3-031-21065-5
eBook Packages: Intelligent Technologies and Robotics (R0)