Interactive Exploration of Three-Dimensional Scientific Visualizations on Large Display Surfaces

Abstract

This chapter surveys the approaches that have been investigated for interacting with scientific visualizations on large surfaces such as tables and walls. It deliberately does not cover VR-based interaction or tangible input, but instead focuses on techniques in which input is provided on, or directed at, the surface itself. In particular, tactile interaction techniques are covered, and the challenges of gestural input and of combining touch input with stereoscopic rendering are discussed. Where possible, connections to collaborative interaction scenarios are pointed out, even though most publications to date focus on single-user interaction.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Inria, Saclay, France
