An Onboard Monocular Vision System for Autonomous Takeoff, Hovering and Landing of a Micro Aerial Vehicle
In this paper, we present an onboard monocular vision system for autonomous takeoff, hovering and landing of a Micro Aerial Vehicle (MAV). Since pose information with metric scale is critical for the autonomous flight of a MAV, we present a novel solution to six degrees of freedom (DOF) pose estimation. It is based on a single image of a typical landing pad, which consists of the letter “H” surrounded by a circle. A vision algorithm for robust, real-time landing pad recognition is implemented. A 5 DOF pose is then estimated from the elliptic projection of the circle using projective geometry, and the remaining geometric ambiguity is resolved by incorporating the gravity vector estimated by the inertial measurement unit (IMU). The last degree of freedom, the yaw angle of the MAV, is estimated from the ellipse fitted to the letter “H”. The effectiveness of the presented vision system is demonstrated by comparing its pose estimates to ground truth data provided by a tracking system and by using those estimates as control inputs for autonomous flights of a quadrotor.
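The pose estimation described above hinges on fitting an ellipse to the projected circle of the landing pad. As a hedged illustration (not the authors' implementation), the sketch below fits a general conic to 2D edge points by least squares and recovers the ellipse center, the kind of geometric primitive from which the 5 DOF pose would subsequently be derived:

```python
import numpy as np

def fit_conic(pts):
    """Least-squares fit of a general conic
    a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0
    to 2D points, taken as the smallest singular vector
    of the design matrix."""
    x, y = pts[:, 0], pts[:, 1]
    D = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])
    _, _, Vt = np.linalg.svd(D)
    return Vt[-1]  # coefficients (a, b, c, d, e, f), up to scale

def conic_center(coef):
    """Center of a central conic: the point where the gradient
    of the quadratic form vanishes."""
    a, b, c, d, e, _ = coef
    return np.linalg.solve([[2 * a, b], [b, 2 * c]], [-d, -e])

# Synthetic check with made-up numbers: a circle viewed obliquely
# projects to an ellipse; here centered at (1.5, -0.8).
t = np.linspace(0, 2 * np.pi, 50, endpoint=False)
pts = np.column_stack([1.5 + 0.30 * np.cos(t),
                       -0.8 + 0.18 * np.sin(t)])
center = conic_center(fit_conic(pts))
```

In a real pipeline the conic coefficients, together with the camera intrinsics, yield two candidate circle poses; the paper resolves this two-fold ambiguity by choosing the solution consistent with the IMU's gravity vector.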
Keywords: Monocular vision · Onboard vision · Micro Aerial Vehicle · Landing · Ellipse · Projective geometry