On-Body Tangible Interaction: Using the Body to Support Tangible Manipulations for Immersive Environments

  • Houssem Saidi
  • Marcos Serrano
  • Pourang Irani
  • Christophe Hurter
  • Emmanuel Dubois
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11749)

Abstract

Recent technological advances in immersive devices open up many opportunities for users to visualize data in their environments. However, current interactive solutions fail to provide a convenient approach for manipulating such complex immersive visualizations. In this article, we present a new approach to interaction in these environments, which we call On-Body Tangible interaction (OBT): using the body to physically support the manipulation of an input device. Using the body to support the interaction allows users to move around in their environment and avoids the fatigue inherent to mid-air interaction. In this paper, we explore the use of a rolling device, whose form factor suits on-body interaction well and which offers enough degrees of freedom (DoF) for data manipulation. We first propose a new design space for OBT interactions, focusing on the forearm. We then validate the feasibility of this approach through an experiment establishing the range, stability and comfort of gestures performed with the device on the forearm. Our results reveal that on-body tangible interaction on the forearm is stable and offers multiple DoFs with little fatigue. We illustrate the benefits of our approach through sample applications in which OBT interactions are used to select and execute space-time cube operations.

Keywords

Immersive environment · On-body interaction · Tangible interaction

Notes

Acknowledgments

This work is partially funded by the AP2 project (grant ANR-15-CE23-0001).

References

  1. Bach, B., Sicat, R., Beyer, J., Cordeil, M., Pfister, H.: The hologram in my hand: how effective is interactive exploration of 3D visualizations in immersive tangible augmented reality? IEEE Trans. Vis. Comput. Graph. 24, 457–467 (2018)
  2. Bach, B., Dragicevic, P., Archambault, D., Hurter, C., Carpendale, S.: A descriptive framework for temporal data visualizations based on generalized space-time cubes. Comput. Graph. Forum 36(6), 36–61 (2017). https://doi.org/10.1111/cgf.12804
  3. Balakrishnan, R., Baudel, T., Kurtenbach, G., Fitzmaurice, G.: The Rockin’Mouse: integral 3D manipulation on a plane. In: Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI 1997), pp. 311–318. ACM, New York (1997). https://doi.org/10.1145/258549.258778
  4. Beaudouin-Lafon, M.: Instrumental interaction: an interaction model for designing post-WIMP user interfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2000), pp. 446–453. ACM, New York (2000). https://doi.org/10.1145/332040.332473
  5. Benko, H., Wilson, A.D.: Multi-point interactions with immersive omnidirectional visualizations in a dome. In: ACM International Conference on Interactive Tabletops and Surfaces (ITS 2010), pp. 19–28. ACM, New York (2010). https://doi.org/10.1145/1936652.1936657
  6. Benko, H., Izadi, S., Wilson, A.D., Cao, X., Rosenfeld, D., Hinckley, K.: Design and evaluation of interaction models for multi-touch mice. In: Proceedings of Graphics Interface 2010 (GI 2010), Ottawa, Ontario, Canada (2010)
  7. Bergé, L.-P., Serrano, M., Perelman, G., Dubois, E.: Exploring smartphone-based interaction with overview+detail interfaces on 3D public displays. In: Proceedings of the 16th International Conference on Human-Computer Interaction with Mobile Devices & Services (MobileHCI 2014), pp. 125–134. ACM, New York (2014). https://doi.org/10.1145/2628363.2628374
  8. Bergé, L.-P., Dubois, E., Raynal, M.: Design and evaluation of an “Around the SmartPhone” technique for 3D manipulations on distant display. In: Proceedings of the 3rd ACM Symposium on Spatial User Interaction (SUI 2015), pp. 69–78. ACM, New York (2015). https://doi.org/10.1145/2788940.2788941
  9. Bergstrom-Lehtovirta, J., Hornbæk, K., Boring, S.: It’s a wrap: mapping on-skin input to off-skin displays. In: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI 2018), Paper 564. ACM, New York (2018). https://doi.org/10.1145/3173574.3174138
  10. Besançon, L., Issartel, P., Ammi, M., Isenberg, T.: Mouse, tactile, and tangible input for 3D manipulation. In: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI 2017), pp. 4727–4740. ACM, New York (2017). https://doi.org/10.1145/3025453.3025863
  11. Borg, G.: Borg’s Perceived Exertion and Pain Scales. Human Kinetics, Champaign (1998)
  12. Burgess, R., et al.: Selection of large-scale 3D point cloud data using gesture recognition. In: Camarinha-Matos, L.M., Baldissera, T.A., Di Orio, G., Marques, F. (eds.) DoCEIS 2015. IAICT, vol. 450, pp. 188–195. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-16766-4_20
  13. Clarke, S., Dass, N., Chau, D.H.: NaturalMotion: exploring gesture controls for visualizing time-evolving graphs. In: Proceedings of IEEE VIS (Poster Session) (2016)
  14. Coffey, D., et al.: Interactive slice WIM: navigating and interrogating volume datasets using a multi-surface, multi-touch VR interface. IEEE Trans. Vis. Comput. Graph. 18(10), 1614–1626 (2012). https://doi.org/10.1109/TVCG.2011.283
  15. Cordeil, M., Bach, B., Li, Y., Wilson, E., Dwyer, T.: A design space for spatio-data coordination: tangible interaction devices for immersive information visualisation. In: Proceedings of the 10th IEEE Pacific Visualization Symposium (PacificVis 2017) (2017)
  16. Cordeil, M., Dwyer, T., Hurter, C.: Immersive solutions for future air traffic control and management. In: Proceedings of the 2016 ACM Companion on Interactive Surfaces and Spaces (ISS Companion 2016), pp. 25–31. ACM, New York (2016). https://doi.org/10.1145/3009939.3009944
  17. Dobbelstein, D., Hock, P., Rukzio, E.: Belt: an unobtrusive touch input device for head-worn displays. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI 2015), pp. 2135–2138. ACM, New York (2015). https://doi.org/10.1145/2702123.2702450
  18. Giannopoulos, I., Komninos, A., Garofalakis, J.: Natural interaction with large map interfaces in VR. In: Proceedings of the 21st Pan-Hellenic Conference on Informatics (PCI 2017), Article 56. ACM, New York (2017). https://doi.org/10.1145/3139367.3139424
  19. Fruchard, B., Lecolinet, E., Chapuis, O.: Impact of semantic aids on command memorization for on-body interaction and directional gestures. In: Proceedings of the International Working Conference on Advanced Visual Interfaces (AVI 2018), Castiglione della Pescaia, Italy. ACM, New York (2018)
  20. Hancock, M., Carpendale, S., Cockburn, A.: Shallow-depth 3D interaction: design and evaluation of one-, two- and three-touch techniques. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2007), pp. 1147–1156. ACM, New York (2007)
  21. Harrison, C., Tan, D., Morris, D.: Skinput: appropriating the body as an input surface. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2010), pp. 453–462. ACM, New York (2010). https://doi.org/10.1145/1753326.1753394
  22. Hinckley, K., Wigdor, D.: Input technologies and techniques. In: The HCI Handbook, 3rd edn., Chap. 9. Taylor & Francis
  23. Hinckley, K., Sinclair, M., Hanson, E., Szeliski, R., Conway, M.: The VideoMouse: a camera-based multi-degree-of-freedom input device. In: Proceedings of the 12th Annual ACM Symposium on User Interface Software and Technology (UIST 1999), pp. 103–112. ACM, New York (1999). https://doi.org/10.1145/320719.322591
  24. Hurter, C., Tissoires, B., Conversy, S.: FromDaDy: spreading aircraft trajectories across views to support iterative queries. IEEE Trans. Vis. Comput. Graph. 15(6), 1017–1024 (2009). https://doi.org/10.1109/tvcg.2009.145
  25. Issartel, P., Guéniat, F., Ammi, M.: A portable interface for tangible exploration of volumetric data. In: Proceedings of the ACM Symposium on Virtual Reality Software and Technology (VRST 2014), pp. 209–210. ACM, New York (2014). https://doi.org/10.1145/2671015.2671130
  26. Jackson, B., et al.: A lightweight tangible 3D interface for interactive visualization of thin fiber structures. IEEE Trans. Vis. Comput. Graph. 19(12), 2802–2809 (2013). https://doi.org/10.1109/TVCG.2013.121
  27. Jansen, Y., Dragicevic, P., Fekete, J.-D.: Tangible remote controllers for wall-size displays. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2012), pp. 2865–2874. ACM, New York (2012). https://doi.org/10.1145/2207676.2208691
  28. Karrer, T., Wittenhagen, M., Lichtschlag, L., Heller, F., Borchers, J.: Pinstripe: eyes-free continuous input on interactive clothing. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2011), pp. 1313–1322. ACM, New York (2011). https://doi.org/10.1145/1978942.1979137
  29. Kim, J.-S., Gračanin, D., Matković, K., Quek, F.: Finger walking in place (FWIP): a traveling technique in virtual environments. In: Butz, A., Fisher, B., Krüger, A., Olivier, P., Christie, M. (eds.) SG 2008. LNCS, vol. 5166, pp. 58–69. Springer, Heidelberg (2008). https://doi.org/10.1007/978-3-540-85412-8_6
  30. Klein, T., Guéniat, F., Pastur, L., Vernier, F., Isenberg, T.: A design study of direct-touch interaction for exploratory 3D scientific visualization. Comput. Graph. Forum 31, 1225–1234 (2012)
  31. Lopez, D., Oehlberg, L., Doger, C., Isenberg, T.: Towards an understanding of mobile touch navigation in a stereoscopic viewing environment for 3D data exploration. IEEE Trans. Vis. Comput. Graph. 22(5), 1616–1629 (2016)
  32. Lundström, C., et al.: Multi-touch table system for medical visualization: application to orthopedic surgery planning. IEEE Trans. Vis. Comput. Graph. 17(12) (2011). https://doi.org/10.1109/TVCG.2011.224
  33. Milgram, P., Colquhoun, H.: A taxonomy of real and virtual world display integration. In: Mixed Reality: Merging Real and Virtual Worlds, vol. 1, pp. 1–26 (1999)
  34. Miranda, B.P., Carneiro, N.J.S., dos Santos, C.G.R., de Freitas, A.A., Magalhães, J., Meiguins, B.S.: Categorizing issues in mid-air InfoVis interaction. In: 2016 20th International Conference on Information Visualisation (IV), pp. 242–246. IEEE (2016)
  35. Ortega, M., Nigay, L.: AirMouse: finger gesture for 2D and 3D interaction. In: Gross, T., et al. (eds.) INTERACT 2009. LNCS, vol. 5727, pp. 214–227. Springer, Heidelberg (2009). https://doi.org/10.1007/978-3-642-03658-3_28
  36. Parsons, L.M.: Inability to reason about an object’s orientation using an axis and angle of rotation. J. Exp. Psychol. Hum. Percept. Perform. 21(6), 1259–1277 (1995)
  37. Perelman, G., Serrano, M., Raynal, M., Picard, C., Derras, M., Dubois, E.: The Roly-Poly Mouse: designing a rolling input device unifying 2D and 3D interaction. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI 2015). ACM, New York (2015). https://doi.org/10.1145/2702123.2702244
  38. Rautaray, S.S.: Real time hand gesture recognition system for dynamic applications. Int. J. UbiComp 3(1), 21–31 (2012)
  39. Rudi, D., Giannopoulos, I., Kiefer, P., Peier, C., Raubal, M.: Interacting with maps on optical head-mounted displays. In: Proceedings of the 2016 Symposium on Spatial User Interaction (SUI 2016), pp. 3–12. ACM, New York (2016). https://doi.org/10.1145/2983310.2985747
  40. Saidi, H., Serrano, M., Irani, P., Dubois, E.: TDome: a touch-enabled 6DOF interactive device for multi-display environments. In: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI 2017), pp. 5892–5904. ACM, New York (2017). https://doi.org/10.1145/3025453.3025661
  41. Schkolne, S., Ishii, H., Schröder, P.: Immersive design of DNA molecules with a tangible interface. In: Proceedings of IEEE Visualization 2004, pp. 227–234. IEEE Computer Society, Los Alamitos (2004). https://doi.org/10.1109/VISUAL.2004.47
  42. Serrano, M., Ens, B.M., Irani, P.P.: Exploring the use of hand-to-face input for interacting with head-worn displays. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2014), pp. 3181–3190. ACM, New York (2014). https://doi.org/10.1145/2556288.2556984
  43. Song, P., Goh, W.B., Fu, C.-W., Meng, Q., Heng, P.-A.: WYSIWYF: exploring and annotating volume data with a tangible handheld device. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2011), pp. 1333–1342. ACM, New York (2011). https://doi.org/10.1145/1978942.1979140
  44. VandenBos, G.R. (ed.): Publication Manual of the American Psychological Association, 6th edn. American Psychological Association, Washington, D.C. (2009)
  45. Vo, D.-B., Lecolinet, E., Guiard, Y.: Belly gestures: body centric gestures on the abdomen. In: Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational (NordiCHI 2014), pp. 687–696. ACM, New York (2014). https://doi.org/10.1145/2639189.2639210
  46. Wagner, J., Nancel, M., Gustafson, S.G., Huot, S., Mackay, W.E.: Body-centric design space for multi-surface interaction. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2013), pp. 1299–1308. ACM, New York (2013). https://doi.org/10.1145/2470654.2466170
  47. Wang, C.-Y., Chu, W.-C., Chiu, P.-T., Hsiu, M.-C., Chiang, Y.-H., Chen, M.Y.: PalmType: using palms as keyboards for smart glasses. In: Proceedings of the International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI 2015), pp. 153–160 (2015). https://doi.org/10.1145/2785830.2785886
  48. Ware, C., Arsenault, R.: Frames of reference in virtual object rotation. In: Proceedings of APGV 2004, pp. 135–141 (2004)
  49. Wong, P.C., Zhu, K., Fu, H.: FingerT9: leveraging thumb-to-finger interaction for same-side-hand text entry on smartwatches. In: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI 2018), Paper 178. ACM, New York (2018)
  50. Zhai, S., Milgram, P.: Quantifying coordination in multiple DOF movement and its application to evaluating 6 DOF input devices. In: Karat, C.-M., Lund, A., Coutaz, J., Karat, J. (eds.) Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 1998), pp. 320–327. ACM Press/Addison-Wesley, New York (1998). https://doi.org/10.1145/274644.274689
  51. Zhu, K., Ma, X., Chen, H., Liang, M.: Tripartite effects: exploring users’ mental model of mobile gestures under the influence of operation, handheld posture, and interaction space. Int. J. Hum.-Comput. Interact. 33(6), 443–459 (2017)

Copyright information

© IFIP International Federation for Information Processing 2019

Authors and Affiliations

  • Houssem Saidi (1)
  • Marcos Serrano (1)
  • Pourang Irani (2)
  • Christophe Hurter (3)
  • Emmanuel Dubois (1)

  1. IRIT, University of Toulouse, Toulouse, France
  2. Department of Computer Science, University of Manitoba, Winnipeg, Canada
  3. ENAC, French Civil Aviation University, Toulouse, France