Journal of Computer Science and Technology, Volume 29, Issue 5, pp. 825–836

Assisting Visually Impaired People to Acquire Targets on a Large Wall-Mounted Display

Regular Paper

Abstract

Large displays have become ubiquitous in our everyday lives, but these displays are designed for sighted people. This paper addresses the need of visually impaired people to acquire targets on large wall-mounted displays. We developed an assistive interface that exploits mid-air gesture input and haptic feedback, and examined its potential for pointing and steering tasks in human-computer interaction (HCI). In two experiments, blind and blindfolded users performed target acquisition tasks using mid-air gestures under two kinds of feedback (haptic and audio). Our results show that participants performed faster in Fitts' law pointing tasks with the haptic feedback interface than with the audio feedback interface. Furthermore, a regression analysis between movement time (MT) and index of difficulty (ID) demonstrates that both the Fitts' law model and the steering law model are effective for evaluating assistive interfaces for the blind. Our work serves as an initial step toward enabling visually impaired people to easily access information on large public displays through haptic interfaces.
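The regression described above fits movement time against an index of difficulty: the Fitts' law ID for pointing is log2(A/W + 1) (Shannon formulation), and the steering law ID for a straight tunnel is A/W, where A is the movement amplitude and W the target width (or tunnel width). A minimal sketch of this analysis, using hypothetical trial data (the amplitudes, widths, and times below are illustrative, not from the paper):

```python
import math

def fitts_id(A, W):
    # Shannon formulation of the Fitts' law index of difficulty (bits)
    return math.log2(A / W + 1)

def steering_id(A, W):
    # Steering law index of difficulty for a straight tunnel
    # of length A and width W (dimensionless)
    return A / W

def fit_linear(xs, ys):
    # Ordinary least squares for MT = a + b * ID
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = mean_y - b * mean_x
    return a, b

# Hypothetical pointing trials: (amplitude A in mm, width W in mm, MT in s)
trials = [(256, 64, 1.1), (256, 32, 1.4), (512, 64, 1.5), (512, 32, 1.8)]
ids = [fitts_id(A, W) for A, W, _ in trials]
mts = [mt for _, _, mt in trials]
a, b = fit_linear(ids, mts)
# a is the intercept, b the slope (s/bit); 1/b gives throughput in bit/s
```

The same `fit_linear` helper applies to steering trials by substituting `steering_id` for `fitts_id`; a high R² of either fit is what supports using these models to evaluate the assistive interface.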

Keywords

haptic I/O; auditory (non-speech) feedback; interaction style; human-computer interaction; visually impaired people


Supplementary material

ESM 1 (PDF, 84 kb)


Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

Center for Human-Computer Interaction, Kochi University of Technology, Kochi, Japan
