Multisensory Immersive Analytics

  • Jon McCormack
  • Jonathan C. Roberts
  • Benjamin Bach
  • Carla Dal Sasso Freitas
  • Takayuki Itoh
  • Christophe Hurter
  • Kim Marriott
Chapter
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11190)

Abstract

While visual analytics has traditionally relied on visual cues, multimodal interaction technologies offer many new possibilities. This chapter explores the opportunities and challenges of using non-visual sensory channels to represent data, helping developers and users understand and interact with it. Users can experience data in new ways: variables from complex datasets can be conveyed through different senses; presentations become more accessible to people with vision impairment and can be personalized to specific user needs; and interactions can engage multiple senses, making them more natural and transparent. All of these techniques help users gain a better understanding of the underlying information. While the emphasis of this chapter is on non-visual immersive analytics, we also discuss how visual presentations can be integrated with other modalities, and the opportunities of combining several sensory signals, including the visual.
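
One concrete instance of conveying a data variable through another sense is parameter-mapping sonification. As a minimal sketch (ours, purely illustrative and not from the chapter), the following Python script maps each value of a numeric series linearly onto a pitch range and renders it as a sequence of short sine tones in a WAV file, using only the standard library; all names and parameter choices are assumptions.

    import math
    import struct
    import wave

    SAMPLE_RATE = 44100            # audio samples per second
    TONE_SECONDS = 0.25            # duration of each data point's tone
    LOW_HZ, HIGH_HZ = 220.0, 880.0 # pitch range (A3 to A5)

    def sonify(values, path="sonified.wav"):
        """Render one sine tone per value, pitch rising with the value."""
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1.0    # guard against constant data
        frames = bytearray()
        for v in values:
            # Linear mapping of the data value onto the pitch range.
            freq = LOW_HZ + (v - lo) / span * (HIGH_HZ - LOW_HZ)
            for i in range(int(SAMPLE_RATE * TONE_SECONDS)):
                amp = 0.5 * math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
                frames += struct.pack("<h", int(amp * 32767))  # 16-bit PCM
        with wave.open(path, "wb") as f:
            f.setnchannels(1)      # mono
            f.setsampwidth(2)      # 2 bytes = 16-bit samples
            f.setframerate(SAMPLE_RATE)
            f.writeframes(bytes(frames))

    # A rising-then-falling series is heard as a rising-then-falling melody.
    sonify([1, 3, 5, 8, 6, 4, 2])

Listening to the output makes the trend of the series audible without any visual display; richer designs could map further variables to loudness, timbre, or spatial position.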

Keywords

Immersive visual analytics · Multisensory visualization · Haptic data visualization

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Jon McCormack (1)
  • Jonathan C. Roberts (2)
  • Benjamin Bach (3)
  • Carla Dal Sasso Freitas (4)
  • Takayuki Itoh (5)
  • Christophe Hurter (6)
  • Kim Marriott (1)

  1. Monash University, Melbourne, Australia
  2. Bangor University, Bangor, UK
  3. University of Edinburgh, Edinburgh, UK
  4. Federal University of Rio Grande do Sul, Porto Alegre, Brazil
  5. Ochanomizu University, Tokyo, Japan
  6. Ecole Nationale de l'Aviation Civile (ENAC), Toulouse, France
