Vision Based Position Control for MAVs Using One Single Circular Landmark

Published in: Journal of Intelligent & Robotic Systems

Abstract

This paper presents a real-time vision based algorithm for 5 degrees-of-freedom pose estimation and set-point control for a Micro Aerial Vehicle (MAV). The camera is mounted on board a quadrotor helicopter. Camera pose estimation is based on the appearance of two concentric circles which are used as the landmark. We show that by using a calibrated camera, conic sections, and the assumption that yaw is controlled independently, it is possible to determine the six degrees-of-freedom pose of the MAV. First, we show how to detect the landmark in the image frame. Then we present a geometric approach for camera pose estimation from the elliptic appearance of a circle in perspective projection. Using this information we are able to determine the pose of the vehicle. Finally, given a set point in the image frame, we are able to control the quadrotor such that the feature appears in the respective target position. The performance of the proposed method is demonstrated through experimental results.
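As a rough illustration of the detection step described in the abstract, the sketch below fits ellipses to image contours and keeps a pair that shares (approximately) one centre, which is one way the projected concentric-circle landmark could be located. It uses OpenCV (version 4 API assumed) with illustrative thresholds; it is a minimal sketch, not the detector described in the paper.

# Hypothetical landmark detector: fit ellipses to contours and look for a
# concentric pair. Function name, thresholds, and structure are assumptions
# made for illustration only, not the authors' implementation.
import cv2
import numpy as np

def find_concentric_ellipses(gray, center_tol=5.0, min_area=100.0):
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
    ellipses = []
    for c in contours:
        if len(c) >= 5 and cv2.contourArea(c) >= min_area:
            ellipses.append(cv2.fitEllipse(c))   # ((cx, cy), (w, h), angle)
    for i in range(len(ellipses)):
        for j in range(i + 1, len(ellipses)):
            (c1, ax1, _), (c2, ax2, _) = ellipses[i], ellipses[j]
            same_center = np.hypot(c1[0] - c2[0], c1[1] - c2[1]) < center_tol
            different_size = abs(ax1[1] - ax2[1]) > 2 * center_tol
            if same_center and different_size:
                return ellipses[i], ellipses[j]  # inner and outer circle images
    return None

Once the landmark ellipse is available, a heavily simplified position estimate follows from a calibrated pinhole model. The short function below assumes a near-frontal view and a circle of known radius R; the paper's conic-section derivation instead recovers the pose for arbitrary viewing angles, so this is only a sanity-check approximation, and all parameter names are assumptions.

# Simplified back-projection: for a circle of known radius R (metres) whose
# image is an ellipse with semi-major axis a_px pixels, a calibrated camera
# with focal length f_px and principal point (cx, cy) gives a rough position.
def approximate_position(ellipse, R, f_px, cx, cy):
    (u, v), (w, h), _ = ellipse            # output of cv2.fitEllipse
    a_px = max(w, h) / 2.0                 # semi-major axis in pixels
    Z = f_px * R / a_px                    # depth along the optical axis
    X = (u - cx) * Z / f_px                # lateral offsets, treating the
    Y = (v - cy) * Z / f_px                # ellipse centre as the circle centre
    return X, Y, Z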

Author information

Corresponding author

Correspondence to Davide Scaramuzza.

Additional information

The research leading to these results has received funding from the European Community’s Seventh Framework Programme (FP7/2007-2013) under grant agreement n. 231855 (sFly: http://www.sfly.org). Daniel Eberli is currently a Master’s student at ETH Zurich. Davide Scaramuzza is currently a senior researcher and team leader at ETH Zurich. Stephan Weiss is currently a PhD student at ETH Zurich. Roland Siegwart is a full professor at ETH Zurich and head of the Autonomous Systems Lab.

About this article

Cite this article

Eberli, D., Scaramuzza, D., Weiss, S. et al. Vision Based Position Control for MAVs Using One Single Circular Landmark. J Intell Robot Syst 61, 495–512 (2011). https://doi.org/10.1007/s10846-010-9494-8

Keywords

Navigation