
Towards a Virtual Reality Visualization of Hand-Object Interactions to Support Remote Physical Therapy

  • Conference paper

Proceedings of the 15th International Conference on Ubiquitous Computing & Ambient Intelligence (UCAmI 2023)

Abstract

Improving object manipulation skills through hand-object interaction exercises is crucial for rehabilitation. With healthcare resources limited, physical therapists increasingly prescribe remote exercise routines followed up with remote monitoring. However, remote assessment of motor skills remains challenging due to the lack of effective motion visualizations. Exploring innovative visualization approaches is therefore crucial, and virtual reality (VR) has shown potential to address this limitation. It remains unclear, however, how VR visualization can represent hand-object interactions in an understandable way. To address this gap, we present VRMoVi, a VR visualization system that incorporates multiple layers of 3D visualization to depict movements. In a two-stage study, we demonstrated VRMoVi's potential for representing hand-object interactions: its visualization outperformed traditional representations, and its detailed features improved users' understanding of hand-object interactions. This study takes an initial step toward developing VR visualization of hand-object interactions to support remote physical therapy.
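The abstract does not detail VRMoVi's internals, so the following is a purely illustrative sketch (not the authors' implementation) of one quantity a 3D movement-visualization layer might expose: the path length of a tracked wrist trajectory, computed from synthetic position samples.

```python
import math


def path_length(points):
    """Total 3D path length of an ordered list of (x, y, z) samples, in the
    same units as the input (here, meters)."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))


# Synthetic wrist trajectory: 50 samples along a straight reach
# from the origin to (0.3, 0.1, 0.2) m.
n = 50
reach = [(0.3 * i / (n - 1), 0.1 * i / (n - 1), 0.2 * i / (n - 1))
         for i in range(n)]

# For a straight path, the summed segment lengths equal the chord length.
print(round(path_length(reach), 3))
```

A real hand-object interaction pipeline would replace the synthetic samples with controller or hand-tracking data and render the trajectory as a 3D ribbon or point trail; the metric itself is a common, simple proxy for movement efficiency.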



Acknowledgements

This research was funded by the 2022 Research Seed Fund of Dr. Trudi Qi, the 2020–22 Research Startup Fund of Dr. Franceli Cibrian, and the Robert A. Day Undergraduate Research Grant of Meghna Raswan at Chapman University.

Author information


Correspondence to Trudi Di Qi or Franceli L. Cibrian.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Qi, T.D., Boyd, L., Fitzpatrick, S., Raswan, M., Cibrian, F.L. (2023). Towards a Virtual Reality Visualization of Hand-Object Interactions to Support Remote Physical Therapy. In: Bravo, J., Urzáiz, G. (eds) Proceedings of the 15th International Conference on Ubiquitous Computing & Ambient Intelligence (UCAmI 2023). UCAmI 2023. Lecture Notes in Networks and Systems, vol 835. Springer, Cham. https://doi.org/10.1007/978-3-031-48306-6_14
