Extending Embodied Interactions in Mixed Reality Environments

  • Mohamed Handosa
  • Hendrik Schulze
  • Denis Gračanin
  • Matthew Tucker
  • Mark Manuel
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10909)

Abstract

Recent advances in mixed reality (MR) technologies provide a great opportunity to support the deployment and use of MR applications for training and education. Interacting with virtual objects can help users stay more engaged and retain more information than traditional approaches. MR devices such as the Microsoft HoloLens use spatial mapping to place virtual objects in the surrounding space and to support embodied interaction with those objects. However, some applications may require a range of embodied interactions beyond the capabilities of the MR device itself; for instance, interacting with virtual objects using the arms, legs, and whole body, much as we interact with physical objects. We describe an approach that extends the functionality of the Microsoft HoloLens to support a wider range of embodied interactions in an MR space by using the Microsoft Kinect V2 sensor. Based on that approach, we developed a system that maps the skeletal data captured by the Kinect to the HoloLens coordinate system. We measured the overall delay of the developed system to evaluate its effect on application responsiveness. The described system is currently being used to develop a HoloLens application for nurse aide certification in the Commonwealth of Virginia.
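
The abstract describes mapping skeletal data captured by the Kinect into the HoloLens coordinate system but does not give the transformation itself. Below is a minimal sketch, not the authors' implementation: it assumes the two device spaces are related by a rigid transform that is calibrated once from paired reference points observed in both spaces (the function names and the sample points are illustrative only) and then applied to every tracked joint each frame.

```python
# Minimal sketch: register the Kinect V2 space to the HoloLens space with a
# rigid (rotation + translation) transform, then map skeletal joints per frame.
# Assumes paired reference points seen in both spaces; names are illustrative.
import numpy as np

def estimate_rigid_transform(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Return the 4x4 rigid transform mapping Nx3 points src onto dst
    in the least-squares sense (Kabsch algorithm via SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = dst_c - R @ src_c
    return T

def kinect_to_hololens(joints: np.ndarray, T: np.ndarray) -> np.ndarray:
    """Apply the calibrated transform T to Nx3 Kinect joint positions."""
    homogeneous = np.hstack([joints, np.ones((len(joints), 1))])
    return (homogeneous @ T.T)[:, :3]

# Example: calibrate from four shared reference points, then map one joint.
kinect_pts = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
holo_pts   = np.array([[2., 0, 0], [2, 0, 1], [3, 0, 0], [2, 1, 0]])
T = estimate_rigid_transform(kinect_pts, holo_pts)
head = kinect_to_hololens(np.array([[0.5, 0.5, 0.5]]), T)  # -> [[2.5, 0.5, 0.5]]
```

A rigid model is a natural fit here because both devices report metric 3-D positions; any residual registration error adds a spatial offset to the rendered interactions, on top of the end-to-end delay the paper measures.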

Keywords

Mixed reality · User gestures · Tracking

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Mohamed Handosa (1)
  • Hendrik Schulze (1)
  • Denis Gračanin (1), corresponding author
  • Matthew Tucker (1)
  • Mark Manuel (1)

  1. Department of Computer Science, Virginia Tech, Blacksburg, USA
