Abstract
Improving object manipulation skills through hand-object interaction exercises is crucial for rehabilitation. Because healthcare resources are limited, physical therapists prescribe remote exercise routines and follow up with remote monitoring. However, remote assessment of motor skills remains challenging due to the lack of effective motion visualizations. Exploring innovative forms of visualization is therefore crucial, and virtual reality (VR) has shown potential to address this limitation. Yet it remains unclear how VR visualization can represent hand-object interactions in an understandable way. To address this gap, we present VRMoVi, a VR visualization system that incorporates multiple levels of 3D visualization layers to depict movements. In a two-stage study, we showed VRMoVi's potential for representing hand-object interactions: its visualization outperformed traditional representations, and its detailed features improved understanding of hand-object interactions. This study takes an initial step toward developing VR visualizations of hand-object interaction to support remote physical therapy.
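To make the idea of "multiple levels of 3D visualization layers" concrete, the sketch below builds coarse-to-fine views of a recorded hand trajectory: an aggregate summary, a smoothed path, and the raw per-frame samples. This is a hypothetical illustration of the layering concept only, not the authors' VRMoVi implementation; the function name and layer choices are assumptions.

```python
import numpy as np

def make_layers(positions: np.ndarray, window: int = 5):
    """Build three coarse-to-fine "layers" from an (N, 3) hand-position sequence.

    Layer 1: aggregate summary (start, end, total path length).
    Layer 2: smoothed trajectory (moving average) for an uncluttered 3D path.
    Layer 3: raw per-frame samples for detailed inspection.
    """
    # Layer 1: summary statistics of the whole movement.
    deltas = np.diff(positions, axis=0)
    path_length = float(np.linalg.norm(deltas, axis=1).sum())
    summary = {"start": positions[0], "end": positions[-1],
               "path_length": path_length}

    # Layer 2: moving-average smoothing along each axis.
    kernel = np.ones(window) / window
    smoothed = np.column_stack([
        np.convolve(positions[:, i], kernel, mode="valid")
        for i in range(positions.shape[1])
    ])

    # Layer 3: the unmodified per-frame samples.
    return summary, smoothed, positions

# Example: a synthetic 60-frame straight-line reach.
t = np.linspace(0.0, 1.0, 60)
traj = np.column_stack([t, 0.5 * t, np.zeros_like(t)])
summary, smoothed, raw = make_layers(traj)
```

In a VR viewer, each layer could be toggled independently, so a therapist can start from the summary and drill down to individual frames only where needed.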
Acknowledgements
This research was funded by the 2022 Research Seed Fund of Dr. Trudi Qi, the 2020–22 Research Startup Fund of Dr. Franceli Cibrian, and the Robert A. Day Undergraduate Research Grant of Meghna Raswan at Chapman University.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Qi, T.D., Boyd, L., Fitzpatrick, S., Raswan, M., Cibrian, F.L. (2023). Towards a Virtual Reality Visualization of Hand-Object Interactions to Support Remote Physical Therapy. In: Bravo, J., Urzáiz, G. (eds) Proceedings of the 15th International Conference on Ubiquitous Computing & Ambient Intelligence (UCAmI 2023). UCAmI 2023. Lecture Notes in Networks and Systems, vol 835. Springer, Cham. https://doi.org/10.1007/978-3-031-48306-6_14
Print ISBN: 978-3-031-48305-9
Online ISBN: 978-3-031-48306-6
eBook Packages: Intelligent Technologies and Robotics (R0)