Designing Eyes-Free Interaction

  • Ian Oakley
  • Jun-Seok Park
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4813)

Abstract

As the form factors of computational devices diversify, the concept of eyes-free interaction is becoming increasingly relevant: it is no longer hard to imagine use scenarios in which screens are inappropriate. However, there is currently little consensus about this term. It is regularly employed in different contexts and with different intents. One key consequence of this multiplicity of meanings is a lack of easily accessible insights into how to best build an eyes-free system. This paper seeks to address this issue by thoroughly reviewing the literature, proposing a concise definition and presenting a set of design principles. The application of these principles is then elaborated through a case study of the design of an eyes-free motion input system for a wearable device.

Keywords

Eyes-free interaction, design principles, motion input

Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Ian Oakley¹
  • Jun-Seok Park¹

  1. POST-PC Research Group, Electronics and Telecommunications Research Institute, 161 Gajeong-dong, Yuseong-gu, Daejeon, 305-700, Republic of Korea