Estimating Ego-Motion in Panoramic Image Sequences with Inertial Measurements

Conference paper
Part of the Springer Tracts in Advanced Robotics book series (STAR, volume 70)


This paper considers the problem of estimating the focus of expansion of optical flow fields induced by ego-motion of the camera in panoramic image sequences. The focus of expansion provides a measurement of the direction of motion of the vehicle, a key requirement for implementing obstacle avoidance algorithms. We propose a two-stage approach to this problem. First, external angular rotation measurements provided by an on-board inertial measurement unit are used to de-rotate the observed optic flow field. Then a robust statistical method is applied to provide an estimate of the focus of expansion, as well as a selection of inlier data points associated with the hypothesis. This is followed by a least-squares minimisation, utilising only the inlier data, that provides accurate estimates of the residual angular rotation and the focus of expansion of the flow. The least-squares optimisation is solved using a geometric Newton algorithm. For the robust estimator we consider and compare RANSAC and a k-means algorithm. The approach in this paper does not require explicit features, and can be applied to patchy, noisy, sparse optic flow fields. The approach is demonstrated in simulations and on video data obtained from an aerial robot equipped with panoramic cameras.
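The two-stage pipeline described in the abstract (IMU-based de-rotation of the flow field, followed by a robust RANSAC estimate of the focus of expansion) can be sketched as below. This is an illustrative reconstruction, not the authors' implementation: it assumes a spherical flow model in which the de-rotated flow at viewing direction η lies in the great circle through η and the focus of expansion w, so each flow vector yields a constraint (η × φ) · w = 0. All function names and the synthetic data are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def derotate(eta, flow, omega):
    """Remove the rotational flow component -omega x eta from the measured flow."""
    return flow + np.cross(omega, eta)

def ransac_foe(eta, flow, iters=500, thresh=0.02):
    """RANSAC estimate of the focus of expansion w on the unit sphere.
    Each de-rotated flow vector constrains w to a great circle with
    normal n_i = eta_i x flow_i, i.e. n_i . w = 0."""
    n = np.cross(eta, flow)
    n /= np.linalg.norm(n, axis=1, keepdims=True)
    best_w, best_mask, best_count = None, None, -1
    for _ in range(iters):
        i, j = rng.choice(len(eta), size=2, replace=False)
        w = np.cross(n[i], n[j])          # intersection of two great circles
        norm_w = np.linalg.norm(w)
        if norm_w < 1e-9:                 # degenerate (parallel) sample pair
            continue
        w /= norm_w
        mask = np.abs(n @ w) < thresh     # consensus over all constraints
        if mask.sum() > best_count:
            best_w, best_mask, best_count = w, mask, mask.sum()
    # Resolve the sign ambiguity: translational flow points away from the FOE,
    # i.e. flow . (I - eta eta^T) w < 0 for the true w.
    tang = best_w - (eta @ best_w)[:, None] * eta
    if np.einsum('ij,ij->i', flow[best_mask], tang[best_mask]).sum() > 0:
        best_w = -best_w
    return best_w, best_mask

# Synthetic check: random viewing directions, known motion and depths.
N = 200
eta = rng.normal(size=(N, 3))
eta /= np.linalg.norm(eta, axis=1, keepdims=True)
v = np.array([1.0, 0.0, 0.0])            # true translation (FOE = +x axis)
omega = np.array([0.0, 0.05, -0.02])     # true angular velocity
depth = rng.uniform(1.0, 5.0, size=N)
flow_trans = -(v - (eta @ v)[:, None] * eta) / depth[:, None]
flow = flow_trans - np.cross(omega, eta)  # measured flow = rotation + translation
w_hat, inliers = ransac_foe(eta, derotate(eta, flow, omega))
```

On this noise-free synthetic data `w_hat` recovers the translation direction `v`; in practice the inlier set returned here would seed the least-squares refinement stage, which the paper solves with a geometric Newton iteration on the sphere.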


Keywords: Inertial Measurement Unit · Newton Iteration · Newton Algorithm · Bundle Adjustment · Antipodal Point





Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  1. Department of Engineering, Australian National University, Australia
  2. CSIRO ICT Centre, Pullenvale, Australia
