
Event-Based Visual Tracking in Dynamic Environments

  • Conference paper
  • Published in: ROBOT2022: Fifth Iberian Robotics Conference (ROBOT 2022)

Part of the book series: Lecture Notes in Networks and Systems (LNNS, volume 589)

Abstract

Visual object tracking under challenging motion and lighting conditions can be hindered by the limitations of conventional cameras, which are prone to producing images with motion blur. Event cameras are novel sensors well suited to performing vision tasks robustly under these conditions. However, due to the nature of their output, applying them to object detection and tracking is non-trivial. In this work, we propose a framework that takes advantage of both event cameras and off-the-shelf deep learning for object tracking. We show that reconstructing event data into intensity frames improves tracking performance in conditions under which conventional cameras fail to provide acceptable results.
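The abstract describes a pipeline in which the event stream is first reconstructed into intensity frames, which are then processed by off-the-shelf deep learning for detection and tracking. The sketch below is an illustrative stand-in for that idea, not the authors' implementation: it accumulates events into a frame by naive polarity summation (where the paper would use a learned reconstruction network), and smooths position measurements with a constant-velocity Kalman filter. The function names, the event tuple layout `(x, y, t, polarity)`, and all filter parameters are assumptions made here for illustration.

```python
import numpy as np

def events_to_frame(events, height, width):
    """Naive event accumulation: sum signed polarities per pixel and
    normalize to an 8-bit image. A crude stand-in for the learned
    event-to-intensity reconstruction the paper builds on."""
    frame = np.zeros((height, width), dtype=np.float32)
    for x, y, _t, polarity in events:       # polarity is +1 or -1
        frame[y, x] += polarity
    frame -= frame.min()
    if frame.max() > 0:
        frame /= frame.max()
    return (255 * frame).astype(np.uint8)

class ConstantVelocityKF:
    """Minimal Kalman filter over [cx, cy, vx, vy] for one object center.
    dt, q, r are illustrative values, not tuned to any dataset."""
    def __init__(self, dt=0.03, q=1.0, r=5.0):
        self.x = np.zeros(4)                 # state: position and velocity
        self.P = np.eye(4) * 100.0           # state covariance
        self.F = np.eye(4)                   # constant-velocity motion model
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.eye(2, 4)                # we observe position only
        self.Q = np.eye(4) * q               # process noise
        self.R = np.eye(2) * r               # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        y = z - self.H @ self.x                     # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)    # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

if __name__ == "__main__":
    # Tiny synthetic demo. In the full pipeline each reconstructed frame
    # would go to an off-the-shelf detector (e.g. YOLOv5); here the
    # "detections" are hand-made noisy center measurements.
    events = [(10, 12, 0.000, +1), (11, 12, 0.001, +1), (40, 30, 0.002, -1)]
    frame = events_to_frame(events, height=64, width=64)
    kf = ConstantVelocityKF()
    for z in [np.array([10.0, 12.0]), np.array([10.6, 12.4])]:
        kf.predict()
        kf.update(z)
    print(frame.shape, "estimated center:", kf.x[:2])
```

A real implementation in the spirit of the paper would replace `events_to_frame` with a recurrent reconstruction network, run a deep detector on each frame, and extend the filter from a single center point to full bounding boxes and multiple targets.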



Acknowledgments

This work has been supported by the Spanish projects PID2021-124137OB-I00 and PGC2018-098719-B-I00 (MCIU / AEI / FEDER, UE), by the Gobierno de Aragón under Project DGA T45-20R, by the Universidad de Zaragoza and Banco Santander, by the Consejo Nacional de Ciencia y Tecnología (CONACYT-Mexico), and by Spanish grant FPU20/03134.

Author information

Corresponding author: Irene Perez-Salesa.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG


Cite this paper

Perez-Salesa, I., Aldana-López, R., Sagüés, C. (2023). Event-Based Visual Tracking in Dynamic Environments. In: Tardioli, D., Matellán, V., Heredia, G., Silva, M.F., Marques, L. (eds) ROBOT2022: Fifth Iberian Robotics Conference. ROBOT 2022. Lecture Notes in Networks and Systems, vol 589. Springer, Cham. https://doi.org/10.1007/978-3-031-21065-5_15
