PE-DLS: a novel method for performing real-time full-body motion reconstruction in VR based on Vive trackers

  • Original Article
  • Published:
Virtual Reality

Abstract

Real-time full-body motion capture (MoCap) is becoming necessary for enabling natural interaction and creating deeper immersion in virtual reality (VR). To reduce the cost and complexity of MoCap systems, some studies track only the joint data of the root and end effectors and reconstruct full-body motion by solving inverse kinematics (IK) problems. However, ensuring the accuracy of full-body motion reconstruction in real time is challenging because the problem is inherently under-constrained. In this paper, we propose PE-DLS, a novel method that performs full-body motion reconstruction in two stages: pose estimation (PE) and damped least squares (DLS) optimization. First, we use analytical IK solvers to estimate the spine and limbs in sequence. To further improve accuracy, we then use the DLS method to optimize the results obtained from PE. To evaluate performance, we compare PE-DLS with other methods in terms of reconstruction error and computational time on publicly available datasets. The results indicate that PE-DLS outperforms the other methods in mean per-joint position error (2.11 cm) and mean per-joint rotation error (10.75°) at low time cost (1.65 ms per frame). Furthermore, we implement a full-body MoCap system based on an HTC Vive headset and five Vive trackers. Live demos and qualitative comparisons show that our system achieves quality comparable to a commercial MoCap system. With its high accuracy and low time cost, PE-DLS contributes to constructing real-time MoCap systems in VR.
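The DLS stage named in the abstract refers to the classic damped least-squares IK update, Δθ = Jᵀ(JJᵀ + λ²I)⁻¹e, which trades a small amount of tracking accuracy for numerical stability near singular poses. As a generic illustration only (not the paper's implementation), the sketch below applies this update to a 2-link planar arm; the link lengths, damping factor, and target are arbitrary example values.

```python
import math

# Illustrative damped least squares (DLS) IK for a 2-link planar arm.
# This sketches only the generic DLS update
#     dtheta = J^T (J J^T + lambda^2 I)^{-1} e
# not the paper's full PE-DLS pipeline. Link lengths L1, L2, the damping
# factor LAM, and the target below are arbitrary example values.

L1, L2 = 1.0, 1.0  # link lengths (assumed)
LAM = 0.3          # damping factor lambda (assumed)

def fk(t1, t2):
    """Forward kinematics: end-effector position of the 2-link arm."""
    x = L1 * math.cos(t1) + L2 * math.cos(t1 + t2)
    y = L1 * math.sin(t1) + L2 * math.sin(t1 + t2)
    return x, y

def dls_step(t1, t2, tx, ty):
    """One DLS update moving the end effector toward (tx, ty)."""
    x, y = fk(t1, t2)
    ex, ey = tx - x, ty - y  # position error e
    # Jacobian J = [[dx/dt1, dx/dt2], [dy/dt1, dy/dt2]]
    a = -L1 * math.sin(t1) - L2 * math.sin(t1 + t2)
    b = -L2 * math.sin(t1 + t2)
    c = L1 * math.cos(t1) + L2 * math.cos(t1 + t2)
    d = L2 * math.cos(t1 + t2)
    # M = J J^T + lambda^2 I is 2x2, symmetric, and always invertible
    m11 = a * a + b * b + LAM * LAM
    m12 = a * c + b * d
    m22 = c * c + d * d + LAM * LAM
    det = m11 * m22 - m12 * m12
    # Solve M w = e, then dtheta = J^T w
    wx = (m22 * ex - m12 * ey) / det
    wy = (m11 * ey - m12 * ex) / det
    return t1 + a * wx + c * wy, t2 + b * wx + d * wy

# Iterate from an initial guess toward a reachable target.
t1, t2 = 0.3, 0.3
for _ in range(200):
    t1, t2 = dls_step(t1, t2, 1.2, 0.8)
x, y = fk(t1, t2)  # end effector converges close to the target (1.2, 0.8)
```

The damping term λ²I keeps the inverse well conditioned when J is nearly singular (e.g., a fully extended limb), at the cost of slightly slower convergence; that trade-off is one reason DLS is a common refinement step after an analytical initial guess.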


Availability of data and material

All data generated or analysed during this study are included in this published article.

Code availability

The source code is available at GitHub: https://github.com/A-Qiang/VRMoCap_PEDLS.

Notes

  1. Optoelectronic motion capture system: https://www.vicon.com, last visited on Jul 26th, 2021.

  2. Bio-IK: https://assetstore.unity.com/packages/tools/animation/bio-ik-67819, last visited on Jul 26th, 2021.

  3. Final IK: https://assetstore.unity.com/packages/tools/animation/final-ik-14290, last visited on Jul 26th, 2021.

  4. PEDLS for MoCap Demo: https://github.com/A-Qiang/VRMoCap_PEDLS, last visited on Jul 26th, 2021.

  5. Inertial sensor motion capture system: https://www.noitom.com.cn, last visited on Jul 26th, 2021.

  6. Basic Motions: https://assetstore.unity.com/packages/3d/animations/basic-motions-free-154271, last visited on Jul 26th, 2021.

  7. CMU MoCap database: http://mocap.cs.cmu.edu, last visited on Jul 26th, 2021.


Acknowledgements

We thank Sebastian Starke for sharing the Unity3D implementation of Bio-IK online. The data used in this project were obtained from http://mocap.cs.cmu.edu. The database was created with funding from NSF EIA-0196217.

Funding

No funding was received for conducting this study.

Author information

Authors and Affiliations

Authors

Contributions

All authors contributed to the study conception and design. Material preparation, data collection and analysis were performed by Qiang Zeng. The first draft of the manuscript was written by Qiang Zeng, and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Qian Liu.

Ethics declarations

Conflicts of interest

The authors have no relevant financial or non-financial interests to disclose.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Consent to participate

Not applicable.

Consent for publication

Not applicable.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file1 (MP4 328064 kb)

About this article

Cite this article

Zeng, Q., Zheng, G. & Liu, Q. PE-DLS: a novel method for performing real-time full-body motion reconstruction in VR based on Vive trackers. Virtual Reality 26, 1391–1407 (2022). https://doi.org/10.1007/s10055-022-00635-5

