Abstract
A collaboration scenario in which a remote helper guides a local worker in real time through a task on physical objects is common across a wide range of industries, including health, mining and manufacturing. An established ICT approach to supporting this type of collaboration is to provide a shared visual space and some form of remote gesture, both generally presented as 2D video. Recent research in tele-presence indicates that technologies supporting co-presence and immersion not only improve the process of collaboration but also improve the spatial awareness of the remote participant. We therefore propose a novel approach based on a 3D shared space and 3D hand gestures. A proof-of-concept system for remote guidance, called HandsIn3D, has been developed. The system uses a head-tracked stereoscopic HMD that immerses the helper in a virtual 3D representation of the worker's workspace. It captures the helper's hands in 3D and fuses them into the shared workspace. This paper introduces HandsIn3D and presents a user study demonstrating the feasibility of our approach.
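The fusion step the abstract describes, compositing the helper's 3D-captured hands into the worker's 3D workspace, can be pictured as a point-cloud merge under a calibration transform. The sketch below is a minimal illustration of that idea only; the function name, array layout, and the helper-to-workspace transform are assumptions for exposition, not the authors' actual implementation.

```python
import numpy as np

def fuse_hands_into_workspace(workspace_pts, hand_pts, helper_to_workspace):
    """Merge captured hand geometry into the shared workspace scene.

    workspace_pts: (M, 3) point cloud of the worker's workspace.
    hand_pts: (N, 3) point cloud of the helper's hands, in the helper's
        capture frame.
    helper_to_workspace: 4x4 homogeneous transform (hypothetical
        calibration) mapping the helper's frame into workspace coordinates.
    Returns an (M + N, 3) fused point cloud.
    """
    n = hand_pts.shape[0]
    # Promote hand points to homogeneous coordinates, apply the transform,
    # then drop the homogeneous component.
    homo = np.hstack([hand_pts, np.ones((n, 1))])
    hands_in_ws = (helper_to_workspace @ homo.T).T[:, :3]
    # The shared scene is simply both clouds rendered together.
    return np.vstack([workspace_pts, hands_in_ws])
```

With an identity transform the hands appear at their captured coordinates; in a real system the transform would come from HMD tracking and camera calibration.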
Copyright information
© 2013 IFIP International Federation for Information Processing
Cite this paper
Huang, W., Alem, L., Tecchia, F. (2013). HandsIn3D: Supporting Remote Guidance with Immersive Virtual Environments. In: Kotzé, P., Marsden, G., Lindgaard, G., Wesson, J., Winckler, M. (eds) Human-Computer Interaction – INTERACT 2013. INTERACT 2013. Lecture Notes in Computer Science, vol 8117. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40483-2_5
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-40482-5
Online ISBN: 978-3-642-40483-2
eBook Packages: Computer Science (R0)