DiGVis: A system for comprehension and creation of directed graphs for the visually challenged

  • Long paper
  • Published in Universal Access in the Information Society

Abstract

One of the major barriers to the social inclusion of blind persons is limited access to graphics-based learning resources, which are highly vision oriented. This paper presents a cost-effective tool that facilitates the comprehension and creation of virtual directed graphs, such as flowcharts, using the alternate modalities of audio and touch. It provides a physically accessible virtual spatial workspace and a multimodal interface to represent directed graphs non-visually and interactively. The concept of a spatial query is used to aid exploration and mental visualization through audio and tactile feedback. A unique aspect of the tool, named DiGVis, is that it offers representations of directed graphs that are compatible for sighted and non-sighted persons. A study with 28 visually challenged subjects indicates that the tool facilitates comprehension of the layout and directional connectivity of elements in a virtual diagram. Further, in a pilot study, blind persons could independently comprehend a virtual flowchart layout and its logical steps, and were also able to create the flowchart data without sighted assistance using DiGVis. A comparison with sighted subjects using DiGVis for a similar task demonstrates the effectiveness of the technique for inclusive education.
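The paper itself does not publish source code, so the following is only a minimal, hypothetical sketch of the two ideas the abstract names: a directed graph whose elements carry positions in a virtual 2-D workspace, and a radius-based spatial query that returns nearby elements so they can be announced through audio. All names here (`Node`, `DirectedGraph`, `spatial_query`, `outgoing`) and the radius-based query semantics are assumptions for illustration, not DiGVis's actual design.

```python
import math
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Node:
    """A diagram element (e.g., a flowchart step) placed in the virtual workspace."""
    label: str   # text spoken as audio feedback when the node is queried
    x: float     # workspace coordinates (arbitrary units)
    y: float

@dataclass
class DirectedGraph:
    """Directed graph whose nodes carry 2-D positions in the workspace."""
    nodes: Dict[str, Node] = field(default_factory=dict)
    edges: List[Tuple[str, str]] = field(default_factory=list)  # (source, target)

    def add_node(self, name: str, label: str, x: float, y: float) -> None:
        self.nodes[name] = Node(label, x, y)

    def add_edge(self, src: str, dst: str) -> None:
        self.edges.append((src, dst))

    def spatial_query(self, x: float, y: float, radius: float) -> List[str]:
        """Return labels of nodes within `radius` of the probed point,
        nearest first, so each hit can be announced over audio."""
        hits = [(math.hypot(n.x - x, n.y - y), n.label)
                for n in self.nodes.values()
                if math.hypot(n.x - x, n.y - y) <= radius]
        return [label for _, label in sorted(hits)]

    def outgoing(self, name: str) -> List[str]:
        """Directional connectivity: labels of nodes reachable from `name`."""
        return [self.nodes[dst].label for src, dst in self.edges if src == name]

# A two-step flowchart and a query at a probe position near "Start":
g = DirectedGraph()
g.add_node("start", "Start", x=0.0, y=0.0)
g.add_node("read", "Read input", x=0.0, y=-2.0)
g.add_edge("start", "read")
print(g.spatial_query(0.3, -0.2, radius=1.0))  # ['Start']
print(g.outgoing("start"))                     # ['Read input']
```

Keeping positions on the nodes themselves is one way the same graph data could drive both a conventional visual drawing and the audio-tactile rendering, consistent with the compatible-representation goal the abstract describes.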

Acknowledgments

The authors gratefully acknowledge the officials of the Institute for Blind, Chandigarh, and the Asha Kiran Vocational Training Centre for Differently Abled, Chandigarh, India, for their support in facilitating the studies. The authors are also extremely thankful to everyone who participated in this research.

Author information

Corresponding author: Poonam Syal.

About this article

Cite this article

Syal, P., Chatterji, S. & Sardana, H.K. DiGVis: A system for comprehension and creation of directed graphs for the visually challenged. Univ Access Inf Soc 15, 199–217 (2016). https://doi.org/10.1007/s10209-014-0387-7
