HandsOnVideo: Towards a Gesture based Mobile AR System for Remote Collaboration

Abstract

There is currently a strong need for collaborative augmented reality (AR) systems that allow two or more participants to interact over a distance on a mobile task involving tangible artefacts (e.g., a machine, a patient, a tool) and coordinated operations. Our interest lies in designing and developing an AR system to support a mobile worker engaged in the maintenance and repair of complex mining equipment. This paper presents HandsOnVideo, our ongoing work towards a gesture-based mobile AR system for remote collaboration. HandsOnVideo was designed and developed using a participatory design approach. Through this approach, we learnt that helpers found it natural to use their hands to point to objects and explain procedures, and that workers found it valuable to see the hands of the person guiding them remotely. We also observed, however, that the form of hand-gesture interaction supported by HandsOnVideo introduced network latency. This paper describes some of the design trade-offs we encountered during the design process and tested with representative end users; these trade-offs will be investigated further as part of our research agenda.

Keywords

  • Augmented Reality
  • Video Stream
  • Hand Gesture
  • Task Space
  • Panoramic View

Acknowledgements

We would like to thank the Future mine research theme of the CSIRO Minerals Down Under flagship for sponsoring and supporting this research effort.

Author information

Corresponding author

Correspondence to Leila Alem.

Copyright information

© 2011 Springer Science+Business Media, LLC

About this paper

Cite this paper

Alem, L., Tecchia, F., Huang, W. (2011). HandsOnVideo: Towards a Gesture based Mobile AR System for Remote Collaboration. In: Alem, L., Huang, W. (eds) Recent Trends of Mobile Collaborative Augmented Reality Systems. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-9845-3_11

  • DOI: https://doi.org/10.1007/978-1-4419-9845-3_11

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-1-4419-9844-6

  • Online ISBN: 978-1-4419-9845-3

  • eBook Packages: Computer Science (R0)