Visual Odometry from the Images of the Reference Surface with Small Interframe Rotations

Thematic Issue · Automation and Remote Control

Abstract

The paper considers the problem of visual odometry based on a sequence of video frames captured by a camera facing perpendicularly downward at the reference surface. The problem is solved under the assumption that the frame rate is high enough that the interframe rotation and shift are small. The technology is implemented as a sequence of the following steps: estimating the shift and rotation to an accuracy of an integer number of pixels by the correlation method, refining the shift and rotation parameters by the optical flow method, and correcting the estimation errors caused by uneven motion and fluctuations in the camera's distance to the reference surface by estimating the deviations of local calibration characteristics from their mean values. Results of experimental studies of the technology on test trajectories, obtained by simulating the motion of a vehicle along the reference surface, are presented.
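To make the coarse-to-fine scheme concrete, here is a minimal Python sketch built on standard OpenCV primitives rather than the authors' code: phase correlation supplies an integer-pixel translation guess, and pyramidal Lucas-Kanade optical flow with a rigid-transform fit refines the shift and recovers the rotation. The function name, parameter values, and the use of cv2.estimateAffinePartial2D are illustrative assumptions, not the paper's implementation; in particular, the coarse rotation search by correlation is omitted for brevity.

```python
# Illustrative sketch only: an OpenCV stand-in for the correlation +
# optical-flow pipeline described in the abstract; names and parameter
# values are assumptions, not the authors' implementation.
import cv2
import numpy as np

def interframe_motion(prev_gray, curr_gray):
    """Estimate a small interframe shift (px) and rotation (rad) between
    two grayscale frames of a downward-looking camera."""
    # Step 1: coarse translation to integer-pixel accuracy via phase correlation.
    (dx, dy), _ = cv2.phaseCorrelate(np.float32(prev_gray), np.float32(curr_gray))
    dx, dy = round(dx), round(dy)

    # Step 2: refine with sparse Lucas-Kanade optical flow, seeded by the
    # coarse shift (valid because the interframe motion is assumed small).
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                  qualityLevel=0.01, minDistance=8)
    guess = (pts + np.float32([dx, dy])).copy()
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, pts, guess,
        flags=cv2.OPTFLOW_USE_INITIAL_FLOW)
    ok = status.ravel() == 1

    # Fit a 4-DOF similarity (rotation, uniform scale, translation) to the
    # tracked points; RANSAC discards outlier tracks.
    M, _ = cv2.estimateAffinePartial2D(pts[ok], nxt[ok], method=cv2.RANSAC)
    if M is None:
        raise ValueError("rigid fit failed: too few valid tracks")
    theta = np.arctan2(M[1, 0], M[0, 0])   # interframe rotation, radians
    tx, ty = M[0, 2], M[1, 2]              # refined subpixel shift, pixels
    scale = np.hypot(M[0, 0], M[1, 0])     # deviation from 1 suggests a change
                                           # in camera-to-surface distance
    return tx, ty, theta, scale
```

The scale output hints at the third step of the pipeline: for a camera at a fixed nominal height, the per-frame scale (a local calibration characteristic) should stay near its mean value, so its deviations can be used to correct the metric shift estimates when the camera-to-surface distance fluctuates.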



Author information

Correspondence to E. Yu. Minaev, L. A. Zherdeva, or V. A. Fursov.

Additional information

Translated by V. Potapchouck


Cite this article

Minaev, E.Y., Zherdeva, L.A. & Fursov, V.A. Visual Odometry from the Images of the Reference Surface with Small Interframe Rotations. Autom Remote Control 83, 1496–1506 (2022). https://doi.org/10.1134/S00051179220100022


Keywords: Navigation