Implicit Interaction in Multimodal Human-Machine Systems

  • Matthias Rötting
  • Thorsten Zander
  • Sandra Trösterer
  • Jeronimo Dzaack
Conference paper

Imagine you click on a file on your computer by mistake. The computer processes the input and starts to open the corresponding application. But this takes some time. You immediately recognize your mistake and prepare to close the application as soon as it opens so you can continue your intended task. You feel distracted and helpless, and your feelings are accompanied by facial expressions and inner thoughts. What if the technical system could recognize your mistake by analyzing selected information about you, the user? Like a human partner in face-to-face communication, the system would notice your mistake almost as soon as you did and adapt its behavior accordingly.
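The scenario above can be sketched as a minimal implicit-interaction loop. This is a hypothetical illustration, not the authors' system: the names (`detect_error`, `handle_click`) and the stub classifier are assumptions. A real implementation would run a trained statistical model over EEG or eye-tracking features; here a pre-computed error probability stands in for that model's output.

```python
def detect_error(signal_features, threshold=0.5):
    """Stub 'classifier': in a real system, a statistical model would
    estimate the error probability from physiological features. Here we
    simply read a pre-computed value from the feature dict."""
    return signal_features.get("error_probability", 0.0) > threshold

def handle_click(open_application, cancel_open, signal_features):
    """Start opening the application, but cancel implicitly if the
    user's signals indicate the click was a mistake."""
    if detect_error(signal_features):
        return cancel_open()
    return open_application()

# A slip of the mouse yields a high error probability, so the pending
# open is cancelled without any explicit command from the user.
result = handle_click(
    open_application=lambda: "opened",
    cancel_open=lambda: "cancelled",
    signal_features={"error_probability": 0.9},
)
```

The point of the sketch is the control flow, not the classifier: the explicit command (the click) is overridden by implicit information derived from the user's state.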


Keywords: Fixation duration · Technical system · Mental workload · Implicit information · Statistical machine learning





Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Matthias Rötting (1)
  • Thorsten Zander (1)
  • Sandra Trösterer (1)
  • Jeronimo Dzaack (1)

  1. Chair of Human-Machine Systems, Berlin University of Technology, Berlin, Germany
