Gaze and Speech in Attentive User Interfaces
The trend toward pervasive computing necessitates finding and implementing appropriate ways for users to interact with devices. We believe the future of interaction with pervasive devices lies in attentive user interfaces: systems that pay attention to what users do so that they can attend to what users need. Such systems track user behavior, model user interests, and anticipate user desires and actions. Beyond developing the technologies, applications, and scenarios that support attentive user interfaces, there remains the problem of evaluating the utility of the attentive approach. With this last point in mind, we observed users in an "office of the future," where information is accessed on displays via verbal commands. Based on users' verbal data and eye-gaze patterns, our results suggest that people naturally address individual devices rather than the office as a whole.