Multimedia Tools and Applications, Volume 76, Issue 4, pp 5141–5169

Analyzing visually impaired people’s touch gestures on smartphones

  • Maria Claudia Buzzi
  • Marina Buzzi
  • Barbara Leporini
  • Amaury Trujillo

Abstract

We present an analysis of how visually impaired people perform gestures on touch-screen smartphones, report their preferences, and explain the procedure and technical implementation we followed to collect gesture samples. To that end, we recruited 36 visually impaired participants, divided into two main groups: low-vision people and blind people. We then examined their touch-based gesture preferences in terms of number of strokes, multi-touch, and shape angle, as well as their execution in geometric, kinematic, and relative terms. For this purpose, we developed a wireless system to record sample gestures from several participants simultaneously, with the possibility of monitoring the capture process. Our results are consistent with previous research regarding the preference of visually impaired users for simple gestures: with one finger, in a single stroke, and in one or two cardinal directions. Of the two groups of participants, blind people are less consistent with multi-stroke gestures. In addition, they are more likely than low-vision people to go outside the bounds of the display, given the absence of its physical delimitation, especially with multi-touch gestures. For more complex gestures, rounded shapes are greatly preferred over angular ones, especially by blind people, who have difficulty performing straight gestures with steep or right angles. Based on these results and on previous related research, we offer suggestions to improve gesture accessibility on handheld touchscreen devices.
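The capture system itself is detailed in the full paper rather than in this abstract; as a rough illustration of the kind of per-pointer data such a system must log, the following Kotlin sketch (with hypothetical class and field names of our own, not the authors') records touch samples on an Android view and hands each completed touch sequence to a callback, where it could be serialized and sent over Wi-Fi to a monitoring station as the abstract describes.

    import android.view.MotionEvent
    import android.view.View

    // One capture record: which finger touched where and when, so strokes,
    // multi-touch contacts, and kinematics can be reconstructed offline.
    data class TouchSample(
        val timeMs: Long,   // event time, relative to system boot
        val pointerId: Int, // stable id of a finger within the gesture
        val x: Float,       // screen coordinates, in pixels
        val y: Float,
        val action: Int     // ACTION_DOWN / ACTION_MOVE / ACTION_UP / ...
    )

    class GestureRecorder(
        private val onGesture: (List<TouchSample>) -> Unit
    ) : View.OnTouchListener {

        private val samples = mutableListOf<TouchSample>()

        override fun onTouch(v: View, event: MotionEvent): Boolean {
            val action = event.actionMasked
            // Record every active pointer on each event; getHistorical*
            // could be used for finer-grained kinematics between frames.
            for (i in 0 until event.pointerCount) {
                samples += TouchSample(
                    timeMs = event.eventTime,
                    pointerId = event.getPointerId(i),
                    x = event.getX(i),
                    y = event.getY(i),
                    action = action
                )
            }
            if (action == MotionEvent.ACTION_UP ||
                action == MotionEvent.ACTION_CANCEL
            ) {
                // Last finger lifted: emit the finished trace and reset.
                onGesture(samples.toList())
                samples.clear()
            }
            return true
        }
    }

Note that this sketch treats each down-to-lift touch sequence as one sample; grouping several strokes into a single multi-stroke gesture would require an additional timeout or an explicit end-of-gesture control.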

Keywords

Accessibility · Visual impairment · Touch gestures · Mobile devices · Multimodal interfaces


Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  • Maria Claudia Buzzi (1)
  • Marina Buzzi (1)
  • Barbara Leporini (2)
  • Amaury Trujillo (1)
  1. IIT – CNR, Pisa, Italy
  2. ISTI – CNR, Pisa, Italy
