Analyzing visually impaired people’s touch gestures on smartphones

Published in: Multimedia Tools and Applications

Abstract

We present an analysis of how visually impaired people perform gestures on touchscreen smartphones, report their preferences, and explain the procedure and technical implementation we followed to collect gesture samples. To that end, we recruited 36 visually impaired participants and divided them into two groups: people with low vision and blind people. We then examined their touch-based gesture preferences in terms of number of strokes, multi-touch, and shape angle, as well as their execution in geometric, kinematic, and relative terms. For this purpose, we developed a wireless system that records sample gestures from several participants simultaneously while allowing the experimenter to monitor the capture process. Our results are consistent with previous research on the preference of visually impaired users for simple gestures: one finger, a single stroke, and one or two cardinal directions. Of the two groups, blind people are less consistent when performing multi-stroke gestures. They are also more likely than people with low vision to stray outside the bounds of the display, which offers no physical delimitation, especially with multi-touch gestures. For more complex gestures, rounded shapes are greatly preferred over angular ones, especially by blind people, who have difficulty performing straight gestures with steep or right angles. Based on these results and on previous related research, we offer suggestions for improving the gesture accessibility of handheld touchscreen devices.
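As a rough illustration of the kinds of measures the abstract mentions, the sketch below computes a few geometric, kinematic, and shape descriptors (path length, mean speed, total turning angle) from a recorded gesture trace. It is a minimal sketch in Python, assuming gestures are logged as timestamped (x, y) samples tagged with a pointer id and a stroke index; the data layout and the descriptor set are illustrative assumptions, not the authors' actual implementation.

```python
from collections import defaultdict
from dataclasses import dataclass
from math import atan2, hypot, pi

@dataclass
class TouchSample:
    t: float      # timestamp in seconds (assumed logging format)
    x: float      # screen coordinates in px
    y: float
    pointer: int  # finger id, distinguishes multi-touch contacts
    stroke: int   # stroke index, incremented at each touch-down

def strokes(trace):
    """Group samples into per-finger strokes, each kept in time order."""
    groups = defaultdict(list)
    for s in sorted(trace, key=lambda s: s.t):
        groups[(s.pointer, s.stroke)].append(s)
    return list(groups.values())

def path_length(trace):
    # Geometric descriptor: total distance travelled along the sampled path.
    return sum(hypot(b.x - a.x, b.y - a.y)
               for st in strokes(trace) for a, b in zip(st, st[1:]))

def mean_speed(trace):
    # Kinematic descriptor: average speed (px/s) over the whole gesture.
    duration = max(s.t for s in trace) - min(s.t for s in trace)
    return path_length(trace) / duration if duration > 0 else 0.0

def total_turning(trace):
    # Shape descriptor: accumulated absolute heading change in radians;
    # near zero for a straight swipe, large for rounded or zig-zag shapes.
    turn = 0.0
    for st in strokes(trace):
        headings = [atan2(b.y - a.y, b.x - a.x)
                    for a, b in zip(st, st[1:]) if (a.x, a.y) != (b.x, b.y)]
        for h0, h1 in zip(headings, headings[1:]):
            turn += abs((h1 - h0 + pi) % (2 * pi) - pi)  # wrap to [-pi, pi)
    return turn
```

Descriptors of this kind make it possible to quantify, for example, how much a participant's attempt at a right-angled gesture actually curves, or how execution speed differs between blind and low-vision groups.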


Author information

Corresponding author

Correspondence to Maria Claudia Buzzi.

About this article

Cite this article

Buzzi, M.C., Buzzi, M., Leporini, B. et al. Analyzing visually impaired people’s touch gestures on smartphones. Multimed Tools Appl 76, 5141–5169 (2017). https://doi.org/10.1007/s11042-016-3594-9

