2.5DHANDS: a gesture-based MR remote collaborative platform

  • ORIGINAL ARTICLE
  • Published in: The International Journal of Advanced Manufacturing Technology

Abstract

Current remote collaborative systems in manufacturing are based mainly on video-conferencing technology. Their primary aim is to transmit manufacturing process knowledge between remote experts and local workers. However, they do not provide the experts with the same hands-on experience as synergistically working on site in person. Mixed reality (MR) and improving network performance have the capacity to enhance the experience and communication between collaborators in geographically distributed locations. In this paper, we therefore propose a new gesture-based remote collaborative platform using MR technology that enables a remote expert to collaborate with local workers on physical tasks, and we concentrate on collaborative remote assembly as an illustrative use case. The key advantage over other remote collaborative MR interfaces is that the platform projects the remote expert’s gestures into the real worksite to improve performance, co-presence awareness, and the user collaboration experience. We aim to study the effects of sharing the remote expert’s gestures in remote collaboration using a projector-based MR system in manufacturing. Furthermore, we demonstrate the capabilities of our framework with a prototype consisting of a VR HMD, a Leap Motion sensor, and a projector. The prototype system was evaluated in a pilot study against POINTER (adding AR annotations to the task-space view with a mouse), currently the most popular method for augmenting remote collaboration. The assessment covers performance, user satisfaction, and user-perceived collaboration quality in terms of interaction and cooperation. Our results demonstrate a clear difference between the POINTER and 2.5DHANDS interfaces in task completion time. Additionally, the 2.5DHANDS interface was rated statistically significantly higher than the POINTER interface in terms of awareness of the user’s attention, manipulation, self-confidence, and co-presence.
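The prototype streams the remote expert’s hand gestures, captured by a Leap Motion sensor, to the local worksite for projection. The article abstract does not publish implementation details, so the following Python sketch is purely illustrative: the HandFrame structure, its field names, and the UDP transport are assumptions standing in for whatever capture and networking stack the authors actually used.

```python
"""Minimal sketch of a gesture-sharing pipeline like the one described
in the abstract. All names and the wire format are hypothetical."""

import json
import socket
from dataclasses import dataclass, asdict

@dataclass
class HandFrame:
    """One sampled hand pose; fields are illustrative, not from the paper."""
    timestamp_ms: int
    palm_xyz: tuple[float, float, float]              # palm position in sensor space
    fingertip_xyz: list[tuple[float, float, float]]   # five fingertip positions

def send_frame(sock: socket.socket, addr: tuple[str, int], frame: HandFrame) -> None:
    """Serialize a hand frame as JSON and push it to the projector host over UDP."""
    payload = json.dumps(asdict(frame)).encode("utf-8")
    sock.sendto(payload, addr)

if __name__ == "__main__":
    # Placeholder data standing in for a real Leap Motion capture loop.
    frame = HandFrame(
        timestamp_ms=0,
        palm_xyz=(0.0, 120.0, 30.0),
        fingertip_xyz=[(10.0 * i, 150.0, 40.0) for i in range(5)],
    )
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_frame(sock, ("192.0.2.10", 9000), frame)  # example projector-host address
```

UDP keeps the sketch simple and tolerates dropped frames, which suits a continuous pose stream. A real deployment would also need the projector-side rendering and the calibration between sensor and projector coordinate frames, both of which the sketch omits.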



Acknowledgments

We would like to thank Yuming Zheng for donating his personal water pump to our research, and Yue Wang for checking the English of an early version; his constructive comments, gratefully acknowledged, helped improve the paper. We also thank the anonymous reviewers for their constructive suggestions for enhancing this paper. Specifically, we thank our colleagues for their contributions to this research: Li Zhang for the science lead, Dechuan Han for the technical realization, and Jiaxiang Du for the experimental data collection. Moreover, we thank Shuxiang Wang for constructive suggestions that improved the experiment. Finally, we thank the members of Northwestern Polytechnical University for their participation in the experiment.

Funding

This research was sponsored by the Civil Aircraft Special Project (MJZ-2017-G73) and the Seed Foundation of Innovation and Creation for Graduate Students of Northwestern Polytechnical University (ZZ2018084).

Author information

Corresponding authors

Correspondence to Shusheng Zhang or Xiaoliang Bai.

About this article

Cite this article

Wang, P., Zhang, S., Bai, X. et al. 2.5DHANDS: a gesture-based MR remote collaborative platform. Int J Adv Manuf Technol 102, 1339–1353 (2019). https://doi.org/10.1007/s00170-018-03237-1
