Enabling Navigation of MAVs through Inertial, Vision, and Air Pressure Sensor Fusion

Chapter
Part of the Lecture Notes in Electrical Engineering book series (LNEE, volume 35)

Abstract

Traditional methods for navigating miniature unmanned aerial vehicles (MAVs) fuse Global Positioning System (GPS) and Inertial Measurement Unit (IMU) information. However, many of the flight scenarios envisioned for MAVs (in urban terrain, indoors, in hostile (jammed) environments, etc.) are not conducive to utilizing GPS. Navigation in GPS-denied areas can be performed using an IMU alone; however, the size, weight, and power constraints of MAVs severely limit the quality of IMUs that can be placed on board, making IMU-only navigation extremely inaccurate. In this paper, we introduce a Kalman-filter-based system for fusing information from two additional sensors (an electro-optical camera and a differential air pressure sensor) with the IMU to improve the navigation abilities of the MAV. We discuss several important implementation issues that must be addressed when fusing information from these sensors. Results demonstrate an improvement of at least 10x in final position and attitude accuracy using the proposed system.
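
The abstract describes a loosely coupled Kalman-filter architecture in which IMU measurements drive the prediction step while camera-derived motion and differential-pressure airspeed readings supply measurement updates. As a rough illustration only (not the implementation reported in the chapter), the Python sketch below shows one common way such a fusion loop is structured; the state layout, measurement models, and noise values are assumptions chosen for readability.

    # Minimal sketch (illustrative only): a loosely coupled Kalman-filter loop
    # where IMU acceleration drives prediction and camera/airspeed measurements
    # provide corrections. State layout, models, and noise values are assumptions.
    import numpy as np

    class FusionEKF:
        def __init__(self):
            self.x = np.zeros(6)        # assumed state: NED position and velocity
            self.P = np.eye(6)          # state covariance
            self.Q = np.eye(6) * 1e-2   # process noise (placeholder values)

        def predict(self, accel_ned, dt):
            # Propagate position/velocity with IMU-derived acceleration (NED frame).
            F = np.eye(6)
            F[0:3, 3:6] = np.eye(3) * dt
            self.x = F @ self.x
            self.x[3:6] += accel_ned * dt
            self.P = F @ self.P @ F.T + self.Q

        def _update(self, z, H, R):
            # Standard Kalman measurement update.
            y = z - H @ self.x
            S = H @ self.P @ H.T + R
            K = self.P @ H.T @ np.linalg.inv(S)
            self.x = self.x + K @ y
            self.P = (np.eye(6) - K @ H) @ self.P

        def update_vision_velocity(self, v_meas):
            # Hypothetical camera-derived ground-velocity measurement.
            H = np.hstack([np.zeros((3, 3)), np.eye(3)])
            self._update(v_meas, H, np.eye(3) * 0.25)

        def update_airspeed(self, airspeed):
            # Differential-pressure airspeed; wind is neglected in this sketch,
            # so the predicted airspeed is the norm of the velocity states.
            v = self.x[3:6]
            speed = np.linalg.norm(v) + 1e-6
            H = np.zeros((1, 6))
            H[0, 3:6] = v / speed       # Jacobian of ||v|| w.r.t. velocity
            self._update(np.array([airspeed]), H, np.array([[0.5]]))

In a full system the state would also carry attitude and IMU bias terms, and the vision measurement model would typically be feature-based rather than a direct velocity reading; the sketch only conveys the overall predict/update structure.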

Keywords

Vision-aided navigation · GPS-denied navigation · Sensor fusion

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

Department of Electrical and Computer Engineering, Brigham Young University, Provo, UT 84602