Abstract
Whenever people talk to each other, non-verbal behaviour plays an important role in regulating their interaction. However, almost all human-computer interaction takes place through a keyboard or mouse, leaving computers oblivious to the non-verbal behaviour of their users. This paper outlines the plan for an interface that adapts to users' non-verbal behaviour as a human would. An Intelligent Tutoring System (ITS) for counting and addition is being implemented in conjunction with the New Zealand Numeracy Project. The system's interface will detect the student's non-verbal behaviour using in-house image processing software, enabling it to respond to that behaviour in ways similar to a human tutor. We have conducted a video study of how human tutors interpret the non-verbal behaviour of students, which has laid the foundation for this research.
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Alexander, S., Sarrafzadeh, A. (2004). Interfaces That Adapt like Humans. In: Masoodian, M., Jones, S., Rogers, B. (eds) Computer Human Interaction. APCHI 2004. Lecture Notes in Computer Science, vol 3101. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-27795-8_70
DOI: https://doi.org/10.1007/978-3-540-27795-8_70
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-22312-2
Online ISBN: 978-3-540-27795-8
eBook Packages: Springer Book Archive