
Events-to-Frame: Bringing Visual Tracking Algorithm to Event Cameras

  • Conference paper
  • First Online in: Digital TV and Wireless Multimedia Communication (IFTC 2020)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1390)


Abstract

Event-based cameras represent a significant shift from standard cameras by mimicking the operation of the biological retina. Unlike traditional cameras, which output full images directly, they report information asynchronously as changes in light intensity, producing a stream of events that each encode a timestamp, a pixel position, and a polarity. Visual tracking with event cameras is a new research topic. In this paper, the event stream output by an event camera is transformed into an image representation by accumulating a fixed number of events, so that tracking algorithms designed for ordinary cameras can be applied directly. In addition, the datasets are relabeled with ground truth and annotated with visual attributes such as noise events, occlusion, and deformation, to facilitate the evaluation of trackers. The datasets are tested on existing tracking algorithms, and extensive experiments show that the constructed datasets are reasonable and effective, and that fast, efficient target tracking can be achieved with state-of-the-art trackers.
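The fixed-count accumulation described in the abstract can be sketched as follows. This is a minimal illustration rather than the authors' exact implementation: the `(t, x, y, polarity)` event layout, the `n_events` window size, and the min-max normalization to an 8-bit image are all assumptions made for the example.

```python
import numpy as np

def events_to_frame(events, width, height, n_events=5000):
    """Accumulate a fixed-size window of events into a 2D frame.

    `events` is an (N, 4) array of (t, x, y, polarity) rows, with
    polarity in {-1, +1}. Only the first `n_events` rows are used,
    mirroring the fixed-count windowing described in the abstract.
    """
    frame = np.zeros((height, width), dtype=np.float32)
    window = events[:n_events]
    xs = window[:, 1].astype(int)
    ys = window[:, 2].astype(int)
    ps = window[:, 3]
    # Add +1/-1 per event at its pixel; repeated pixels accumulate.
    np.add.at(frame, (ys, xs), ps)
    # Normalize to [0, 255] so a frame-based tracker can consume it.
    lo, hi = frame.min(), frame.max()
    if hi > lo:
        frame = (frame - lo) / (hi - lo) * 255.0
    return frame.astype(np.uint8)
```

Once the stream is chunked into such frames, any conventional tracker that expects grayscale images can run on the output unchanged, which is the core idea of the events-to-frame approach.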




Acknowledgment

This work is supported by National Natural Science Foundation of China under Grant No. 61906168 and Zhejiang Provincial Natural Science Foundation of China under Grant No. LY18F020032.

Author information

Corresponding author: Nan Chen.


Copyright information

© 2021 Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Chan, S., Liu, Q., Zhou, X., Bai, C., Chen, N. (2021). Events-to-Frame: Bringing Visual Tracking Algorithm to Event Cameras. In: Zhai, G., Zhou, J., Yang, H., An, P., Yang, X. (eds) Digital TV and Wireless Multimedia Communication. IFTC 2020. Communications in Computer and Information Science, vol 1390. Springer, Singapore. https://doi.org/10.1007/978-981-16-1194-0_18


  • DOI: https://doi.org/10.1007/978-981-16-1194-0_18


  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-16-1193-3

  • Online ISBN: 978-981-16-1194-0

  • eBook Packages: Computer Science, Computer Science (R0)
