Understanding Touch and Motion Gestures for Blind People on Mobile Devices

  • Marco Romano
  • Andrea Bellucci
  • Ignacio Aedo
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9296)


Considering users' preferences and behaviour is necessary to develop accessible interaction for blind people. Mainstream mobile devices are widely used by people with disabilities, but, despite the research community's growing interest in the accessibility of touch interfaces, there is still much to understand about how best to design the interaction of blind people with mobile technologies. To this end, we conducted a preliminary elicitation study (8 participants) to understand how blind people perform touch and motion gestures for common tasks on a mobile phone. We found that blind people do not use motion gestures. We discuss our results according to the type of gestures performed.


Keywords: User-defined gestures · Blind · Accessibility · Touch screens



This work is supported by the CREAx project, funded by the Spanish Ministry of Economy and Competitiveness (grant TIN2014-56534-R). We also thank the ONCE organization of Madrid for its collaboration and support.



Copyright information

© IFIP International Federation for Information Processing 2015

Authors and Affiliations

  1. Universidad Carlos III de Madrid, Leganés, Madrid, Spain
