Journal of Computer Science and Technology, Volume 29, Issue 5, pp 812–824

Designing Motion Gesture Interfaces in Mobile Phones for Blind People

Regular Paper

Abstract

Despite the advanced functions available in smartphones, most blind people still use old-fashioned phones with familiar layouts and tactile buttons. Smartphones support accessibility features such as vibration, speech and sound feedback, and screen readers, but these features only provide feedback in response to user commands or input; it remains a challenge for blind people to discover functions on the screen and to enter commands. Although smartphones support voice commands, such commands are difficult for a system to recognize in noisy environments. At the same time, smartphones integrate sophisticated motion sensors, and motion gestures based on device tilt have been gaining attention for eyes-free input. We believe that motion gesture interaction offers blind people more efficient access to smartphone functions. However, most blind people are not smartphone users, and they are aware of neither the affordances available in smartphones nor the potential of interaction through motion gestures. To identify the most usable gestures for blind people, we conducted a user-defined gesture study with 13 blind participants. Using the gesture set and design heuristics derived from this study, we implemented motion-gesture-based interfaces with speech and vibration feedback for browsing a phone book and making a call. We then conducted a second study to investigate the usability of the motion gesture interface and users' experience with the system. The findings indicate that motion gesture interfaces are more efficient than traditional button interfaces. Based on these results, we provide implications for designing smartphone interfaces.
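To make the style of interaction concrete, the following is a minimal Android sketch of tilt-based phone-book browsing with speech and vibration feedback. It is illustrative only, not the authors' implementation: the class name, tilt threshold, debounce interval, gesture-to-action mapping, and contact list are all assumptions.

```java
// Hypothetical sketch: browse a phone book by tilting the device left/right,
// confirming each gesture with a short vibration and a spoken contact name.
import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.os.Vibrator;
import android.speech.tts.TextToSpeech;

public class ContactBrowserActivity extends Activity implements SensorEventListener {
    private static final float TILT_THRESHOLD = 4.0f; // m/s^2 on the x axis; assumed value
    private static final long DEBOUNCE_MS = 800;      // assumed gap to ignore sensor jitter

    private final String[] contacts = {"Alice", "Bob", "Carol"}; // placeholder data
    private int index = 0;
    private long lastGestureTime = 0;

    private SensorManager sensorManager;
    private Vibrator vibrator;
    private TextToSpeech tts;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        vibrator = (Vibrator) getSystemService(Context.VIBRATOR_SERVICE);
        tts = new TextToSpeech(this, status -> { /* engine ready */ });
    }

    @Override
    protected void onResume() {
        super.onResume();
        Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_UI);
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        long now = System.currentTimeMillis();
        if (now - lastGestureTime < DEBOUNCE_MS) return; // debounce repeated triggers
        float x = event.values[0]; // left/right tilt appears on the x axis
        if (x > TILT_THRESHOLD) {          // tilt one way: previous contact
            index = (index + contacts.length - 1) % contacts.length;
            announce();
            lastGestureTime = now;
        } else if (x < -TILT_THRESHOLD) {  // tilt the other way: next contact
            index = (index + 1) % contacts.length;
            announce();
            lastGestureTime = now;
        }
    }

    // Vibrate briefly to confirm the gesture, then speak the selected name.
    private void announce() {
        vibrator.vibrate(50); // deprecated on newer APIs but widely supported
        tts.speak(contacts[index], TextToSpeech.QUEUE_FLUSH, null, "contact");
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { /* unused */ }

    @Override
    protected void onDestroy() {
        tts.shutdown();
        super.onDestroy();
    }
}
```

The key design point the sketch captures is that every state change is announced through a non-visual channel (vibration plus speech), so the user never needs to inspect the screen.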

Keywords

design; motion gesture; user-defined study


Supplementary material

ESM 1: 11390_2014_1470_MOESM1_ESM.pdf (88 kb)


Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

School of Information, Kochi University of Technology, Kochi, Japan
