
Development and Evaluation of a Chase View for UAV Operations in Cluttered Environments


Abstract

Civilian applications for UAVs will bring these vehicles into low-flying areas cluttered with obstacles such as buildings, trees, power lines, and, more importantly, civilians. The high accident rate of UAVs means that civilian use will come at great risk unless we design systems and protocols that can prevent UAV accidents, better train operators, and augment pilot performance. This paper presents two methods for generating a chase view for the pilot during UAV operations in cluttered environments. The chase view gives the operator a virtual view from behind the UAV during flight. It is produced by generating a virtual representation of the vehicle and its surrounding environment and integrating it with the real-time onboard camera images. Method I uses a real-time mapping approach to generate the surrounding environment, while Method II uses a prior model of the operating environment. Experimental results are presented from tests in which subjects flew in an H0 scale environment using a 6-DOF gantry system. The results showed that the chase view improved UAV operator performance over the traditional onboard camera view.
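
To make the chase-view idea concrete, the sketch below shows one simple way a virtual chase camera could be positioned behind and above the vehicle from its estimated pose. This is a minimal illustrative sketch, not the authors' implementation: the function names, offset distances, and the yaw-only orientation handling are assumptions, and the full system additionally renders the virtual environment model and blends it with the live onboard camera imagery.

    # Minimal sketch (assumed, not from the paper): place a virtual "chase"
    # camera a fixed distance behind and above the UAV using only the
    # vehicle pose reported by a simulator or state estimator.
    import numpy as np

    def yaw_rotation(yaw_rad):
        """Rotation about the world z-axis (heading only, for a stable view)."""
        c, s = np.cos(yaw_rad), np.sin(yaw_rad)
        return np.array([[c, -s, 0.0],
                         [s,  c, 0.0],
                         [0.0, 0.0, 1.0]])

    def chase_camera_pose(uav_position, uav_yaw_rad, back=2.0, up=0.75):
        """Return (camera_position, look_at_point) for a chase view.

        The camera sits `back` meters behind and `up` meters above the UAV
        along its heading and looks at the vehicle, so the operator sees the
        airframe and the space immediately around it. The offsets are
        illustrative values, not taken from the paper.
        """
        R = yaw_rotation(uav_yaw_rad)
        offset_body = np.array([-back, 0.0, up])   # behind and above, in the body frame
        camera_position = uav_position + R @ offset_body
        look_at_point = uav_position               # keep the UAV centered in view
        return camera_position, look_at_point

    # Example: UAV at (10, 5, 3) m with a 45-degree heading.
    cam_pos, target = chase_camera_pose(np.array([10.0, 5.0, 3.0]), np.deg2rad(45.0))
    print(cam_pos, target)

A yaw-only offset keeps the virtual camera level even when the aircraft banks, which is one common choice for chase views; tying the camera to the full attitude is equally plausible and would simply use the complete rotation matrix in place of yaw_rotation.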



Author information

Correspondence to Paul Y. Oh.


Cite this article

Hing, J.T., Sevcik, K.W. & Oh, P.Y. Development and Evaluation of a Chase View for UAV Operations in Cluttered Environments. J Intell Robot Syst 57, 485–503 (2010). https://doi.org/10.1007/s10846-009-9356-4


Keywords

Navigation