Civilian applications for UAVs will bring these vehicles into low-flying areas cluttered with obstacles such as buildings, trees, power lines, and, most importantly, civilians. The high accident rate of UAVs means that civilian use will come at great risk unless we design systems and protocols that can prevent UAV accidents, better train operators, and augment pilot performance. This paper presents two methods for generating a chase view for the pilot during UAV operations in cluttered environments. The chase view gives the operator a virtual view from behind the UAV during flight. This is done by generating a virtual representation of the vehicle and its surrounding environment and integrating it with the real-time onboard camera images. Method I uses a real-time mapping approach to generate the surrounding environment, while Method II uses a prior model of the operating environment. Experimental results are presented from tests in which subjects flew in an HO scale environment using a 6-DOF gantry system. Results showed that the chase view improved UAV operator performance over the traditional onboard camera view.
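The core of the chase view is placing a virtual camera a fixed distance behind and above the vehicle, aligned with its heading, so the rendered scene frames the UAV from a third-person perspective. A minimal sketch of that geometry is shown below; the offset parameters `back` and `up` and the function name are illustrative assumptions, not taken from the paper, and a full implementation would also blend in the registered onboard camera imagery.

```python
import math

def chase_camera_pose(uav_pos, uav_yaw, back=5.0, up=2.0):
    """Compute a virtual chase-camera pose from the UAV pose.

    uav_pos: (x, y, z) position of the UAV in world coordinates.
    uav_yaw: heading angle in radians (0 = +x axis).
    back/up: illustrative offsets (meters) behind and above the vehicle.

    Returns the camera position and its yaw (shared with the UAV so the
    camera looks along the vehicle's direction of travel).
    """
    x, y, z = uav_pos
    cam_x = x - back * math.cos(uav_yaw)  # step back along the heading
    cam_y = y - back * math.sin(uav_yaw)
    cam_z = z + up                        # raise the viewpoint
    return (cam_x, cam_y, cam_z), uav_yaw
```

For example, a UAV at (10, 0, 3) heading along +x yields a camera at (5, 0, 5) looking the same direction, which is the "view from behind" the abstract describes.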
Hing, J.T., Sevcik, K.W. & Oh, P.Y. Development and Evaluation of a Chase View for UAV Operations in Cluttered Environments. J Intell Robot Syst 57, 485 (2010). https://doi.org/10.1007/s10846-009-9356-4
- UAV safety
- UAV accidents
- UAV training