Multi-Run: An Approach for Filling in Missing Information of 3D Roadside Reconstruction

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9555)

Abstract

This paper presents an approach for incrementally adding missing information to a point cloud generated for 3D roadside reconstruction. We use a series of video sequences recorded while driving repeatedly along the road to be reconstructed; the sequences may also be recorded while driving in opposite directions. We call this a multi-run scenario. The only input data besides the stereo images are the readings of a GPS sensor, which guide the merging of point clouds from different sequences into one. The quality of the 3D roadside reconstruction depends directly on the accuracy of the applied ego-motion estimation method. The core of our motion analysis is visual odometry following the traditional workflow in this area: first, establish correspondences of tracked features between two subsequent frames; second, use a stereo-matching algorithm to calculate the depth of the tracked features; then compute the motion between every two frames using a perspective-n-point solver. Additionally, we propose a technique that uses Kalman-filter fusion to track the selected feature points and to filter outliers. Furthermore, we use the GPS data to bound the overall propagation of positioning errors. We report experiments on trajectory estimation and 3D scene reconstruction, and evaluate our approach by measuring how much previously missing information is recovered when analysing data recorded in a subsequent run.
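For illustration, the frame-to-frame visual-odometry step outlined in the abstract (feature tracking, stereo depth, perspective-n-point motion estimation) can be sketched as follows. This is a minimal sketch assuming OpenCV and rectified stereo input; the camera intrinsics, baseline, ORB detector, SGBM stereo matcher, and RANSAC threshold are illustrative assumptions and not values from the paper's implementation.

```python
# Minimal frame-to-frame visual-odometry sketch (assumed parameters, not the
# authors' implementation): stereo depth + feature tracking + PnP with RANSAC.
import cv2
import numpy as np

FOCAL, CX, CY = 700.0, 640.0, 360.0     # assumed intrinsics (pixels)
BASELINE = 0.54                         # assumed stereo baseline (metres)
K = np.array([[FOCAL, 0, CX], [0, FOCAL, CY], [0, 0, 1]], dtype=np.float64)

orb = cv2.ORB_create(nfeatures=2000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)

def backproject(kp, disparity):
    """Lift a keypoint of the rectified left image to 3D using its disparity."""
    u, v = kp.pt
    d = disparity[int(v), int(u)] / 16.0        # SGBM stores disparity * 16
    if d <= 1.0:                                # reject invalid/far disparities
        return None
    z = FOCAL * BASELINE / d
    return np.array([(u - CX) * z / FOCAL, (v - CY) * z / FOCAL, z])

def frame_to_frame_motion(prev_left, prev_right, curr_left):
    """Estimate the rigid motion between two subsequent frames:
    1. stereo matching gives depth for features of the previous frame,
    2. descriptor matching tracks those features into the current frame,
    3. a perspective-n-point solver with RANSAC outlier rejection recovers
       the relative rotation and translation."""
    disparity = sgbm.compute(prev_left, prev_right)
    kp0, des0 = orb.detectAndCompute(prev_left, None)
    kp1, des1 = orb.detectAndCompute(curr_left, None)
    matches = matcher.match(des0, des1)

    pts3d, pts2d = [], []
    for m in matches:
        p3 = backproject(kp0[m.queryIdx], disparity)
        if p3 is not None:
            pts3d.append(p3)
            pts2d.append(kp1[m.trainIdx].pt)
    if len(pts3d) < 6:
        raise ValueError("not enough valid correspondences")

    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.array(pts3d), np.array(pts2d), K, None, reprojectionError=2.0)
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec        # pose of the previous camera frame in the current one
```

Accumulating the returned relative poses yields the camera trajectory; in the approach described above, Kalman-filter fusion of the tracked feature points and the GPS readings would then be used to filter outliers and to bound the drift of the accumulated poses.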

Keywords

Multi-run scenario · Motion analysis · Visual odometry · Kalman filter · GPS data · 3D reconstruction · Multi-sensory integration

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Haokun Geng (1)
  • Hsiang-Jen Chien (2)
  • Reinhard Klette (2)
  1. Department of Computer Science, The University of Auckland, Auckland, New Zealand
  2. School of Engineering, Auckland University of Technology, Auckland, New Zealand