Designing Eyes-Free Interaction

Conference paper · Haptic and Audio Interaction Design (HAID 2007)

Part of the book series: Lecture Notes in Computer Science (volume 4813)

Abstract

As the form factors of computational devices diversify, the concept of eyes-free interaction is becoming increasingly relevant: it is no longer hard to imagine use scenarios in which screens are inappropriate. However, there is currently little consensus about this term. It is regularly employed in different contexts and with different intents. One key consequence of this multiplicity of meanings is a lack of easily accessible insights into how to best build an eyes-free system. This paper seeks to address this issue by thoroughly reviewing the literature, proposing a concise definition and presenting a set of design principles. The application of these principles is then elaborated through a case study of the design of an eyes-free motion input system for a wearable device.





Editor information

Ian Oakley, Stephen Brewster


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Oakley, I., Park, JS. (2007). Designing Eyes-Free Interaction. In: Oakley, I., Brewster, S. (eds) Haptic and Audio Interaction Design. HAID 2007. Lecture Notes in Computer Science, vol 4813. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-76702-2_13

  • DOI: https://doi.org/10.1007/978-3-540-76702-2_13

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-76701-5

  • Online ISBN: 978-3-540-76702-2

  • eBook Packages: Computer Science (R0)
