
Mapping Vicon Motion Tracking to 6-Axis IMU Data for Wearable Activity Recognition

Chapter in Activity and Behavior Computing

Part of the book series: Smart Innovation, Systems and Technologies (SIST, volume 204)

Abstract

Several large datasets captured with motion tracking systems could be useful for training wearable human activity recognition (HAR) systems, if only their spatial data could be mapped to the equivalent inertial measurement unit (IMU) data that would be sensed on the body. In this paper, we describe a mapping from 3D Vicon motion tracking data to data collected from a BlueSense on-body IMU. We characterise the error incurred in order to determine the extent to which useful training data for a wearable activity recognition system can be generated from data collected with a motion capture system. We analyse this by mapping Vicon motion tracking data to rotational velocity and linear acceleration at the head, and comparing the result to actual gyroscope and accelerometer data collected by an IMU mounted on the head. In a 15-minute dataset comprising three static activities—sitting, standing and lying down—we find that 95% of the reconstructed gyroscope data lies within an error of [−7.25; +7.46] \(deg \cdot s^{-1}\), while 95% of the reconstructed accelerometer data lies within [−96.1; +72.9] \(mG\) (milli-g). However, when we introduce more movement by including data collected while walking, these bounds widen to [−19.0; +18.2] \(deg \cdot s^{-1}\) for the gyroscope and [−208; +186] \(mG\) for the accelerometer. We conclude that generating accurate IMU data from motion capture datasets is possible. This could provide larger volumes of training data for activity recognition tasks and help data-hungry techniques such as deep learning to be applied on a larger scale within human activity recognition.
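The mapping described in the abstract can be illustrated with a short sketch. Assuming the motion capture export provides, for the head-mounted marker cluster, world-frame positions in metres and body-to-world orientations as scalar-last quaternions sampled at a fixed rate, angular velocity (gyroscope) can be obtained by differencing successive orientations, and linear acceleration (accelerometer) by double-differentiating position and accounting for gravity, both expressed in the sensor frame. The function below is a minimal sketch under those assumptions (the function name and inputs are illustrative, not the chapter's actual code); the chapter's pipeline, including sensor alignment on the head and the BlueSense axis conventions, involves additional steps not shown here.

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def mocap_to_imu(positions, quaternions, fs, g=9.81):
        # positions: (N, 3) world-frame positions in metres (hypothetical input)
        # quaternions: (N, 4) body-to-world orientations, scalar-last (x, y, z, w)
        # fs: sampling rate in Hz; g: gravitational acceleration in m/s^2
        dt = 1.0 / fs
        rots = R.from_quat(quaternions)

        # Gyroscope: relative rotation between consecutive frames, expressed in
        # the body frame and converted to an angular rate in deg/s (N-1 samples).
        rel = rots[:-1].inv() * rots[1:]
        gyro_deg_s = np.degrees(rel.as_rotvec() / dt)

        # Accelerometer: the second derivative of position gives world-frame
        # linear acceleration; subtracting gravity gives the specific force an
        # accelerometer senses, which is then rotated into the body frame.
        acc_world = np.gradient(np.gradient(positions, dt, axis=0), dt, axis=0)
        g_world = np.array([0.0, 0.0, -g])       # assumes the world z-axis points up
        specific_force = rots.inv().apply(acc_world - g_world)
        acc_mG = specific_force / g * 1000.0     # milli-g, matching the units above

        return gyro_deg_s, acc_mG

For a stationary segment this sketch returns roughly zero rotational velocity and about 1000 mG along the axis opposing gravity, as a real IMU would. In practice, double differentiation amplifies marker noise, so low-pass filtering the position trajectories before differencing is usually necessary.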



Acknowledgements

This work was partially funded by the EPSRC (Brains on Board project, grant number EP/P006094/1). We also thank Nvidia for their TITAN Xp donation.

Author information


Correspondence to Lloyd Pellatt.


Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this chapter


Cite this chapter

Pellatt, L., Dewar, A., Philippides, A., Roggen, D. (2021). Mapping Vicon Motion Tracking to 6-Axis IMU Data for Wearable Activity Recognition. In: Ahad, M.A.R., Inoue, S., Roggen, D., Fujinami, K. (eds) Activity and Behavior Computing. Smart Innovation, Systems and Technologies, vol 204. Springer, Singapore. https://doi.org/10.1007/978-981-15-8944-7_1
