Real-Time 3D Reconstruction and 6-DoF Tracking with an Event Camera

  • Hanme Kim
  • Stefan Leutenegger
  • Andrew J. Davison
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9910)

Abstract

We propose a method that performs real-time 3D reconstruction from a single hand-held event camera with no additional sensing, and that works in unstructured scenes of which it has no prior knowledge. It is based on three decoupled probabilistic filters, estimating respectively 6-DoF camera motion, scene logarithmic (log) intensity gradient, and scene inverse depth relative to a keyframe; we build a real-time graph of these to track and model over an extended local workspace. We also upgrade the gradient estimate for each keyframe into an intensity image, allowing us to recover a real-time, video-like intensity sequence with spatial and temporal super-resolution from the low bit-rate input event stream. To the best of our knowledge, this is the first algorithm able to track general 6-DoF motion while reconstructing arbitrary scene structure, including its intensity, and to recover grayscale video, relying exclusively on event camera data.
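To illustrate the per-event filtering idea behind the gradient estimation described above, the sketch below implements a minimal pixel-wise extended Kalman filter update of a log-intensity gradient. It assumes the standard event-camera model: an event fires when the log intensity at a pixel changes by a contrast threshold C, so over the interval dt since the last event, polarity * C / dt approximates the gradient dotted with the image-plane velocity v. The constants C and R, the function name, and the use of a scalar measurement are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

C = 0.22          # contrast threshold (sensor-dependent; assumed value)
R = 0.01          # measurement noise variance (assumed value)

# Per-pixel state: 2-vector gradient estimate g and 2x2 covariance P.
g = np.zeros(2)
P = np.eye(2) * 1e2

def ekf_gradient_update(g, P, v, dt, polarity):
    """One per-event EKF update of the log-intensity gradient at a pixel.

    Measurement model (linear in g): an event means the log intensity
    changed by polarity * C since the last event at this pixel, which for
    image-plane velocity v over interval dt gives
        z = polarity * C / dt  ~  g . v
    """
    z = polarity * C / dt
    H = v.reshape(1, 2)                 # measurement Jacobian (1x2)
    S = H @ P @ H.T + R                 # innovation variance (1x1)
    K = P @ H.T / S                     # Kalman gain (2x1)
    g = g + (K * (z - H @ g)).ravel()   # state update
    P = P - K @ H @ P                   # covariance update
    return g, P

# Example: events consistent with a horizontal gradient of 1 under
# rightward image-plane flow of 2 px/s (flow would come from the
# decoupled tracking filter in the full system).
v = np.array([2.0, 0.0])
for _ in range(50):
    g, P = ekf_gradient_update(g, P, v, dt=C / 2.0, polarity=+1)
```

After these updates, the horizontal gradient component converges toward z / v_x = 1, while the vertical component stays unobserved (its covariance remains large), mirroring how event data only constrains the gradient along the direction of motion.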

Keywords

6-DoF tracking · 3D reconstruction · Intensity reconstruction · Visual odometry · SLAM · Event-based camera

Notes

Acknowledgements

Hanme Kim was supported by an EPSRC DTA scholarship and the Qualcomm Innovation Fellowship 2014. We thank Jacek Zienkiewicz, Ankur Handa, Patrick Bardow, Edward Johns and other colleagues at Imperial College London for many useful discussions.


Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Hanme Kim (1)
  • Stefan Leutenegger (1)
  • Andrew J. Davison (1)
  1. Department of Computing, Imperial College London, London, UK
