Self-calibration of gyro using monocular SLAM for an indoor mobile robot

  • Hyoung-Ki Lee
  • Kiwan Choi
  • Jiyoung Park
  • Hyun Myung
Regular Papers: Robotics and Automation


Rate gyros are widely used to calculate the heading angle for mobile robot localization. They are normally calibrated in the factory, prior to use, on an expensive rate table. In this paper, a self-calibration method using a monocular camera without a rate table is proposed. The suggested method saves the time and cost of an extra calibration procedure. SLAM (Simultaneous Localization And Mapping) based on visual features and odometry provides reference heading (yaw) angles. Using these, the coefficients of a scale factor function are estimated through Kalman filtering. A new undelayed feature initialization method is proposed to estimate the heading angle without any delay. Experimental results show the efficiency of the proposed method.
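The core idea of the calibration step can be illustrated with a minimal sketch. Here a single constant scale factor `k` is estimated (the paper estimates the coefficients of a scale factor *function*; a constant is a simplifying assumption), using heading increments from a reference source such as SLAM as the Kalman filter measurement. All names and the noise parameters `q` and `r` are hypothetical choices for illustration, not taken from the paper.

```python
import numpy as np

def calibrate_scale_factor(gyro_rates, ref_headings, dt, q=1e-8, r=1e-4):
    """Estimate a constant gyro scale factor k with a scalar Kalman filter.

    Measurement model: the reference heading increment from SLAM is
    dpsi_ref ~= k * omega_raw * dt, so the measurement Jacobian is
    H = omega_raw * dt and the state is the scalar k.
    """
    k, P = 1.0, 1.0                      # initial estimate and covariance
    for i in range(1, len(ref_headings)):
        P += q                           # random-walk process model for k
        H = gyro_rates[i] * dt           # measurement Jacobian
        z = ref_headings[i] - ref_headings[i - 1]  # reference increment
        S = H * P * H + r                # innovation covariance
        K = P * H / S                    # Kalman gain
        k += K * (z - H * k)             # state update
        P *= (1.0 - K * H)               # covariance update
    return k
```

With simulated data whose true scale factor is 1.05, the estimate converges toward 1.05 as reference heading increments accumulate; in practice the reference headings would come from the visual SLAM front end rather than a simulation.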


Keywords: Feature initialization, gyro, Kalman filter, scale factor, self-calibration, SLAM





Copyright information

© Institute of Control, Robotics and Systems and The Korean Institute of Electrical Engineers and Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Hyoung-Ki Lee (1)
  • Kiwan Choi (1)
  • Jiyoung Park (1)
  • Hyun Myung (2)
  1. MS Lab., Samsung Advanced Institute of Technology, Samsung Electronics Co., Ltd., Gyeonggi-do, Korea
  2. Dept. of Civil & Environmental Engineering, KAIST, Daejeon, Korea
