LOVIE: A Word List Optimized for Visually Impaired UsErs on Smartphones

  • Philippe Roussille
  • Mathieu Raynal
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9738)


On mobile devices, most text entry systems offer assistance by means of word lists for prediction or correction. However, visually impaired users can hardly use these lists quickly. In this paper, we present three studies conducted to find an interaction that minimizes the time required to select a word from a list without knowing its elements a priori. First, we explored several possible item layouts and concluded that a linear spatial arrangement is preferable. We then studied audio feedback and determined that providing it on demand is sufficient. Finally, we compared four validation interactions and retained a validation based on a press-release gesture.
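The retained interaction can be illustrated with a minimal sketch. The class below is a hypothetical model, not the authors' implementation: the class and method names, the fixed item height, and the coordinate handling are illustrative assumptions. It captures the three findings above: a linear word list, audio produced only when explicitly requested, and validation on finger release.

```python
class WordListSelector:
    """Sketch of press-release selection on a linear word list.

    Hypothetical illustration: the paper does not describe this code;
    the 60-pixel item height and all names are assumptions.
    """

    def __init__(self, words, item_height=60):
        self.words = words              # linear spatial arrangement of items
        self.item_height = item_height  # assumed height of one item, in pixels
        self.focused = None             # index of the word under the finger

    def press(self, y):
        """Touching down starts browsing the list."""
        self._focus(y)

    def move(self, y):
        """Sliding the finger moves the focus along the list."""
        self._focus(y)

    def _focus(self, y):
        # Map the vertical finger position to a list index, clamped to bounds.
        index = min(max(y // self.item_height, 0), len(self.words) - 1)
        self.focused = index

    def request_audio(self):
        """On-demand audio feedback: the focused word is spoken only
        when the user asks for it (here, returned instead of spoken)."""
        return self.words[self.focused] if self.focused is not None else None

    def release(self):
        """Lifting the finger validates the currently focused word."""
        word = self.words[self.focused] if self.focused is not None else None
        self.focused = None
        return word
```

Under these assumptions, selecting the third word amounts to pressing anywhere on the list, sliding to its row, optionally requesting audio to confirm it, and releasing.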


Keywords: Word list · Item list · Selection task · Audio feedback · Impaired user



Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. CNRS, IRIT, Toulouse, France
