
Designing an Augmented Reality Multimodal Interface for 6DOF Manipulation Techniques

Multimodal Fusion Using Gesture and Speech Input for AR
  • Ajune Wanis Ismail
  • Mark Billinghurst
  • Mohd Shahrizal Sunar
  • Cik Suhaimi Yusof
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 868)

Abstract

Augmented Reality (AR) supports natural interaction across the physical and virtual worlds, and it has recently given rise to a number of novel interaction modalities. This paper presents a method for combining hand gestures with speech input for multimodal interaction in AR. It focuses on providing an intuitive AR environment that supports natural interaction with virtual objects while keeping real-world tasks and interaction mechanisms accessible. The paper reviews previous multimodal interfaces and describes recent AR studies that employ gesture and speech as multimodal input. It then describes an implementation of gesture interaction with speech input for virtual object manipulation in AR. Finally, the paper presents a user evaluation of the technique, showing that it can improve the interaction between virtual and physical elements in an AR environment.
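To illustrate the kind of gesture-and-speech fusion the abstract describes, the sketch below pairs a deictic hand gesture (which selects a target object) with a spoken command (which supplies the action) when the two occur within a short time window. This is a minimal, hypothetical sketch, not the authors' implementation; the event types, field names, and window length are all assumptions made for illustration.

```python
from dataclasses import dataclass

# Hypothetical event types for a gesture-plus-speech fusion sketch.
@dataclass
class GestureEvent:
    pose: str        # e.g. "point", "grab"
    target: str      # id of the virtual object under the hand
    timestamp: float # seconds

@dataclass
class SpeechEvent:
    command: str     # e.g. "move", "rotate", "scale"
    timestamp: float # seconds

FUSION_WINDOW = 1.5  # assumed: gesture and speech must co-occur within 1.5 s

def fuse(gesture: GestureEvent, speech: SpeechEvent):
    """Combine a deictic gesture with a spoken command into one action.

    Returns a (command, target) pair when the two inputs fall within the
    fusion window; otherwise returns None and the inputs stay unfused.
    """
    if abs(gesture.timestamp - speech.timestamp) <= FUSION_WINDOW:
        return (speech.command, gesture.target)
    return None
```

In this scheme the gesture resolves the *referent* ("that cube") and speech resolves the *verb* ("rotate"), which is the usual division of labour in speech-gesture multimodal interfaces; a real system would also handle ambiguous targets and out-of-order input.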

Keywords

Augmented reality · Multimodal interface · Hand gesture · Speech input · Human-computer interaction

Notes

Acknowledgment

We would like to express our appreciation to Universiti Teknologi Malaysia (UTM) for its funding and support. We also thank the Human Interface Technology Laboratory New Zealand (HITLabNZ) at the University of Canterbury. This work was funded by the UTM GUP Funding Scheme.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Ajune Wanis Ismail (1)
  • Mark Billinghurst (2)
  • Mohd Shahrizal Sunar (3)
  • Cik Suhaimi Yusof (1)
  1. School of Computing, Faculty of Engineering, Universiti Teknologi Malaysia, Skudai, Malaysia
  2. Empathic Computing Laboratory, University of South Australia, Mawson Lakes, Australia
  3. UTM-IRDA Digital Media Centre, MaGICX (Media and Game Innovation Centre of Excellence), Institute of Human Centred Engineering, Universiti Teknologi Malaysia, Skudai, Malaysia