Gaze-based interactions in the cockpit of the future: a survey


Flying an aircraft is a mentally demanding task in which pilots must process a vast amount of visual, auditory and vestibular information. They control the aircraft by pulling, pushing and turning various knobs and levers, knowing that mistakes can have fatal consequences. Attempts to improve and optimize these interactions must therefore not increase pilots’ mental workload. By utilizing pilots’ visual attention, gaze-based interactions offer an unobtrusive way to achieve this. This research is the first to actively involve pilots in the exploration of gaze-based interactions in the cockpit. Through a survey of 20 active commercial aviation pilots working for an internationally operating airline, the paper investigates pilots’ perceptions of and needs concerning gaze-based interactions. The results lay the foundation for future research: they not only reflect pilots’ attitudes towards this novel technology, but also provide an overview of situations in which pilots need gaze-based interactions.






Acknowledgements

First, we would like to thank Michel Kölla and Benedikt Wagner for their support throughout the design, execution and evaluation of this study. We also thank all our participants, Christoph Ammann and SWISS International Air Lines Ltd. for their collaboration.


Funding

This study was partially funded by the Federal Office of Civil Aviation (FOCA) Switzerland under Grant Number 2014-142.

Author information



Corresponding author

Correspondence to David Rudi.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Informed consent

The study was approved by the ETH ethics committee (EK 2016-N-31). Participation was voluntary, and informed consent was obtained from all participants.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Rudi, D., Kiefer, P., Giannopoulos, I. et al. Gaze-based interactions in the cockpit of the future: a survey. J Multimodal User Interfaces 14, 25–48 (2020).



Keywords

  • Eye tracking
  • Gaze-based interactions
  • Aviation
  • Survey