Development and evaluation of a self-training system for tennis shots with motion feature assessment and visualization

Abstract

In this paper, we propose a prototype of a self-training system for tennis forehand shots that allows trainees to practice their motion forms by themselves. Our system includes a motion capture device to record the trainee’s motion, and the system visualizes the differences between the features of the trainee’s motion and the correct motion performed by an expert. The system enables trainees to understand the errors in their motion and how to reduce or eliminate them. In this study, we classify the motion features and corresponding visualization methods based on the one-dimensional spatial, rotational, and temporal features of key poses. We also develop a statistical model for the motion features so that the system can assess and prioritize all features of a trainee’s motion. Related features are simultaneously visualized by analyzing their correlations. We describe the process of defining the motion features for the tennis forehand shot of an expert. We evaluated our prototype through several user experiments and demonstrated its feasibility as a self-training system.
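The assessment step described above, fitting a statistical model of an expert's motion features and then scoring and prioritizing a trainee's deviations, can be sketched as a standardized-deviation (z-score) ranking. This is a minimal illustrative sketch: the feature names and reference statistics below are invented assumptions, not the paper's actual feature definitions or data.

```python
import numpy as np

# Hypothetical feature names for a forehand shot. The paper's actual
# features (one-dimensional spatial, rotational, and temporal features
# of key poses) are not enumerated in this abstract.
FEATURES = ["wrist_height", "shoulder_rotation", "swing_duration"]

# Expert reference statistics: per-feature mean and standard deviation,
# estimated from repeated expert demonstrations (illustrative numbers).
expert_mean = np.array([1.10, 85.0, 0.62])
expert_std = np.array([0.05, 4.0, 0.04])

def assess(trainee: np.ndarray) -> list:
    """Score each trainee feature as a standardized deviation (z-score)
    from the expert model, and return (name, z) pairs sorted so that
    the largest absolute deviations come first, i.e. the errors the
    trainee should correct with highest priority."""
    z = (trainee - expert_mean) / expert_std
    return sorted(zip(FEATURES, z), key=lambda fz: abs(fz[1]), reverse=True)

# Example: a trainee whose wrist is too high and shoulder under-rotated.
trainee = np.array([1.22, 78.0, 0.60])
for name, score in assess(trainee):
    print(f"{name}: z = {score:+.2f}")
```

In the same spirit, correlated features (e.g. via `np.corrcoef` over many expert trials) could be grouped so that related errors are visualized together, as the abstract describes.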




Acknowledgements

This work was supported in part by a Grant-in-Aid for Scientific Research (No. 15H02704) from the Japan Society for the Promotion of Science.

Author information


Corresponding author

Correspondence to Masaki Oshita.



About this article


Cite this article

Oshita, M., Inao, T., Ineno, S. et al. Development and evaluation of a self-training system for tennis shots with motion feature assessment and visualization. Vis Comput 35, 1517–1529 (2019). https://doi.org/10.1007/s00371-019-01662-1


Keywords

  • Training system
  • Sports form
  • Motion feature
  • Visualization
  • Motion capture