
Aircraft Pilot Intention Recognition for Advanced Cockpit Assistance Systems

  • Stefan Suck
  • Florian Fortmann
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9744)

Abstract

Modern aircraft are highly automated systems. In general, automation has improved aviation safety significantly. However, automation also gives rise to a variety of adverse behaviors rooted in human factors problems. A major finding is that insufficient support for the partnership between the pilot crew and the aircraft automation can result in conflicting intentions. The European project A-PiMod (Applying Pilot Models for Safer Aircraft) addresses these issues of conventional automation in the aviation domain. The overall objective of the project is to foster pilot crew-automation partnership on the basis of a novel architecture for cooperative automation. An essential part of this architecture is an intention recognition module, which employs a Hidden Markov Model (HMM) to infer the most probable current intention of the human pilots. The HMM is trained and evaluated with data containing interactions of human pilots with the aircraft cockpit systems, obtained during experiments with human pilots in a flight simulator.
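To make the approach concrete, the following is a minimal sketch (in Python) of how an HMM can filter the most probable current pilot intention from a stream of cockpit interactions. All intention labels, observation symbols, and probability values are hypothetical illustrations, not the trained model from the paper; in the project, the parameters would be estimated from the flight-simulator interaction data.

    import numpy as np

    # Hidden states = pilot intentions, observations = discretised cockpit
    # interaction events. Labels and numbers below are made-up placeholders.
    INTENTIONS = ["monitor", "change_altitude", "change_heading"]
    OBSERVATIONS = ["nd_range_switch", "fcu_alt_knob", "fcu_hdg_knob"]

    pi = np.array([0.6, 0.2, 0.2])            # initial intention distribution
    A = np.array([[0.8, 0.1, 0.1],            # intention transition matrix
                  [0.3, 0.6, 0.1],
                  [0.3, 0.1, 0.6]])
    B = np.array([[0.6, 0.2, 0.2],            # emission matrix P(observation | intention)
                  [0.2, 0.7, 0.1],
                  [0.2, 0.1, 0.7]])

    def filter_intention(obs_indices):
        """Forward algorithm: P(current intention | observations so far)."""
        belief = pi * B[:, obs_indices[0]]
        belief /= belief.sum()
        for o in obs_indices[1:]:
            # Predict the next intention, then weight by the new observation.
            belief = (A.T @ belief) * B[:, o]
            belief /= belief.sum()
        return belief

    # Example: the pilot turns the altitude knob twice after a display interaction.
    seq = [OBSERVATIONS.index(o) for o in
           ["nd_range_switch", "fcu_alt_knob", "fcu_alt_knob"]]
    posterior = filter_intention(seq)
    print(dict(zip(INTENTIONS, posterior.round(3))))

The most probable current intention is simply the argmax of this posterior; recognizing a whole intention sequence would instead use the Viterbi algorithm over the same model parameters.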

Keywords

Aircraft crew · Intention recognition · Markov Model

Notes

Acknowledgment

The A-PiMod project is funded by the European Commission Seventh Framework Programme (FP7/2007-2013) under contract number 605141 (Project A-PiMod). Thanks also to my colleague Mark Eilers for his advice during the creation of this work.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. OFFIS Institute for Information Technology, Oldenburg, Germany
