Investigating Feedback for Two-Handed Exploration of Digital Maps Without Vision

  • Sandra Bardot
  • Marcos Serrano
  • Simon Perrault
  • Shengdong Zhao
  • Christophe Jouffrais
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11746)

Abstract

Digital interactive maps on touch surfaces are a convenient alternative to physical raised-line maps for users with visual impairments. To compensate for the absence of passive tactile information, they provide vibrotactile and auditory feedback. However, this feedback is ambiguous when users explore with multiple fingers, since they may not be able to identify which finger triggered it. To address this issue, we explored the use of bilateral feedback, i.e. feedback collocated with each hand, for two-handed map exploration. We first introduced a design space of feedback for two-handed interaction combining two dimensions: spatial location (unilateral vs. bilateral feedback) and similarity (same vs. different feedback). We implemented the four techniques resulting from this design space, using one or two smartwatches worn on the wrists (unilateral and bilateral feedback, respectively). A first study with fifteen blindfolded participants showed that bilateral feedback outperformed unilateral feedback and that feedback similarity had little influence on exploration performance. We then conducted a second study with twelve users with visual impairments, which confirmed the advantage of two-handed over one-handed exploration, and of bilateral over unilateral feedback. The results also brought to light the impact of feedback on exploration strategies.

Keywords

Users with visual impairment · Accessibility · Wearable devices · Smartwatches · Multimodal feedback · Map exploration

Notes

Acknowledgments

We thank all the users who participated in the studies. We also thank the IJA special education center and the “Cherchons pour Voir” lab, both in Toulouse, France. This work was part of the AccessiMap project (research grant AccessiMap ANR-14-CE17-0018).

Supplementary material

Supplementary material 1 (MP4 36468 kb)

Supplementary material 2 (PDF 641 kb)


Copyright information

© IFIP International Federation for Information Processing 2019

Authors and Affiliations

  • Sandra Bardot (1)
  • Marcos Serrano (1)
  • Simon Perrault (2)
  • Shengdong Zhao (3)
  • Christophe Jouffrais (1, 4, 5), corresponding author

  1. IRIT, University of Toulouse, Toulouse, France
  2. Singapore University of Technology and Design (SUTD), Singapore, Singapore
  3. NUS-HCI Lab, National University of Singapore, Singapore, Singapore
  4. IRIT, CNRS, Toulouse, France
  5. IPAL, CNRS, Singapore, Singapore