Universal Access in the Information Society

Volume 13, Issue 3, pp 291–301

Mobile text-entry and visual demands: reusing and optimizing current solutions

  • Hugo Nicolau
  • Tiago Guerreiro
  • David Lucas
  • Joaquim Jorge
Long paper

Abstract

Mobile devices are increasingly used for text entry in contexts where visual attention is fragmented and graphical information is inadequate, yet current virtual keyboards make typing a visually demanding task. This work examines assistive technologies and interface attributes as tools to ease that demand. Two within-subject experiments were performed with 23 and 17 participants, respectively. The first experiment aimed to understand how walking affects text-entry performance and to assess how effective assistive technologies can be in mobile contexts. In the second experiment, adaptive keyboards featuring character prediction and pre-attentive attributes to ease the visual demands of text-entry interfaces were developed and evaluated. Both text-input speed and overall quality were found to suffer in mobile situations. Contrary to expectations, assistive technologies proved ineffective when combined with visual feedback. The second experiment showed that pre-attentive attributes do not significantly affect users’ performance in text-entry tasks, although a 3.3–4.3 % decrease in error rates was measured. Users were also found to reduce walking speed to compensate for the challenges posed by mobile text entry. Caution should be exercised when transferring assistive technologies to mobile contexts, since they need adaptations to address mobile users’ needs. Also, while pre-attentive attributes seemingly have no effect on the performance of experienced QWERTY typists, they show promise for novice users and for typists in attention-demanding contexts.
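
The abstract refers to adaptive keyboards that combine character prediction with pre-attentive visual attributes. The sketch below is a minimal illustration of that idea, not the authors' implementation: it assumes a simple bigram-based next-character predictor and models the pre-attentive cue as a key-size multiplier; the class name, parameters, and training string are hypothetical.

```python
# Minimal sketch (illustrative only): bigram-based next-character prediction
# whose top candidates receive a "pre-attentive" cue, modelled here as an
# enlarged key size. Names and the training corpus are hypothetical.
from collections import Counter, defaultdict

class AdaptiveKeyboard:
    def __init__(self, corpus, top_n=3, highlight_scale=1.5):
        # Count character bigrams in the training text.
        self.bigrams = defaultdict(Counter)
        for prev, nxt in zip(corpus, corpus[1:]):
            self.bigrams[prev][nxt] += 1
        self.top_n = top_n
        self.highlight_scale = highlight_scale

    def predict(self, last_char):
        """Return the most likely next characters after `last_char`."""
        return [c for c, _ in self.bigrams[last_char].most_common(self.top_n)]

    def key_scales(self, last_char, keys="abcdefghijklmnopqrstuvwxyz "):
        """Map each key to a size multiplier: predicted keys are enlarged,
        acting as a cue that can be spotted without serial visual search."""
        predicted = set(self.predict(last_char))
        return {k: (self.highlight_scale if k in predicted else 1.0) for k in keys}

if __name__ == "__main__":
    kb = AdaptiveKeyboard("the quick brown fox jumps over the lazy dog")
    print(kb.predict("t"))                                   # likely continuations of 't'
    print({k: v for k, v in kb.key_scales("t").items() if v > 1.0})  # highlighted keys
```

In a real keyboard the multiplier would drive key rendering (size, colour, or contrast) so that the likely next keys stand out pre-attentively while the full layout remains available.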

Keywords

Mobile · Text-entry · Pre-attentive · Assistive technology

Acknowledgments

This work was supported by FCT through PIDDAC Program funds. Nicolau and Guerreiro were supported by FCT Grants SFRH/BD/46748/2008 and SFRH/BD/28110/2006.

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Hugo Nicolau (1)
  • Tiago Guerreiro (1)
  • David Lucas (1)
  • Joaquim Jorge (1)

  1. IST/Technical University of Lisbon / INESC-ID, Porto Salvo, Portugal