Development and Evaluation of a Chase View for UAV Operations in Cluttered Environments

  • James T. Hing
  • Keith W. Sevcik
  • Paul Y. Oh


Civilian applications will bring UAVs into low-flying areas cluttered with obstacles such as buildings, trees, power lines, and, most importantly, civilians. Given the high accident rate of UAVs, civilian use will come at great risk unless we design systems and protocols that prevent UAV accidents, better train operators, and augment pilot performance. This paper presents two methods for generating a chase view for the pilot during UAV operations in cluttered environments. The chase view gives the operator a virtual view from behind the UAV during flight, produced by generating a virtual representation of the vehicle and its surrounding environment and integrating it with the real-time onboard camera images. Method I uses a real-time mapping approach to generate the surrounding environment, while Method II uses a prior model of the operating environment. Experimental results are presented from tests in which subjects flew in an HO-scale environment using a 6-DOF gantry system. The results showed that the chase view improved UAV operator performance over the traditional onboard camera view.
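The abstract does not detail how the virtual chase camera is positioned, but the core idea of a chase view, a camera placed at a fixed offset behind and above the vehicle's current pose, looking at it, can be sketched as follows (the function name and offset values are illustrative assumptions, not the authors' implementation):

```python
import math

def chase_camera_pose(uav_pos, yaw, back=5.0, up=2.0):
    """Place a virtual chase camera a fixed distance behind and above
    the UAV, aligned with its heading (yaw in radians).

    uav_pos: (x, y, z) UAV position in world coordinates.
    Returns (camera_pos, look_at_target) as tuples.
    """
    x, y, z = uav_pos
    # Step back along the heading direction and raise the camera,
    # so the rendered view shows the vehicle and the scene ahead of it.
    cam = (x - back * math.cos(yaw),
           y - back * math.sin(yaw),
           z + up)
    return cam, uav_pos

cam, target = chase_camera_pose((10.0, 0.0, 3.0), yaw=0.0)
# Camera sits behind and above the UAV, aimed at the vehicle itself.
```

In a renderer, the returned pair would feed a look-at transform each frame, so the virtual viewpoint tracks the vehicle while the real onboard camera image is composited into the corresponding region of the scene.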


Keywords: UAV safety, UAV accidents, UAV training





Copyright information

© Springer Science + Business Media B.V. 2009

Authors and Affiliations

  1. Department of Mechanical Engineering and Mechanics, Drexel University, Philadelphia, USA
