
Electromyography Signal-Based Gesture Recognition for Human-Machine Interaction in Real-Time Through Model Calibration

Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1364)

Abstract

In this work, we achieve up to 92% classification accuracy of electromyographic data between five gestures in pseudo-real-time. Most current state-of-the-art methods in electromyographic signal processing are unable to classify real-time data in a post-learning environment, that is, after the model is trained and its results are analysed. We show that a process of model calibration raises models from 67.87% real-time classification accuracy to 91.93%, an increase of 24.06 percentage points. We also show that an ensemble of classical machine learning models can outperform a Deep Neural Network. An original dataset of EMG data is collected from 15 subjects for 4 gestures (Open-Fingers, Wave-Out, Wave-In, Close-Fist) using a Myo Armband to measure forearm muscle activity. The dataset is cleaned between gesture performances on a per-subject basis, and a sliding temporal window algorithm is used to perform statistical analysis of the EMG signals and extract meaningful mathematical features as input to the learning paradigms. The classifiers used in this paper are a Random Forest, a Support Vector Machine, a Multilayer Perceptron, and a Deep Neural Network. The three classical classifiers are combined into a single model through an ensemble voting system, which scores 91.93% compared to the Deep Neural Network's 88.68%, both after calibrating to a subject and performing real-time classification (pre-calibration scores for the two being 67.87% and 74.27%, respectively).
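The sliding temporal window step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window length, step size, and the particular statistical features (mean, standard deviation, minimum, maximum) are assumptions chosen for clarity, and the toy signal stands in for a single EMG channel.

```python
import statistics

def sliding_window_features(signal, window_size=50, step=25):
    """Extract simple statistical features from overlapping windows of a
    single-channel signal. Window and step sizes are illustrative only."""
    features = []
    for start in range(0, len(signal) - window_size + 1, step):
        window = signal[start:start + window_size]
        features.append({
            "mean": statistics.fmean(window),
            "stdev": statistics.pstdev(window),
            "min": min(window),
            "max": max(window),
        })
    return features

# Toy "EMG" signal of 75 samples: yields two overlapping 50-sample windows.
emg = [i % 10 for i in range(75)]
rows = sliding_window_features(emg)
print(len(rows))  # 2
```

Each feature row would then be fed to the classifiers (Random Forest, SVM, MLP) whose per-window predictions are combined by the voting ensemble.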

Keywords

  • Real-time gesture classification
  • Machine learning
  • Deep learning
  • Biosignal processing
  • EMG


Notes

  1. Dataset available at https://www.kaggle.com/chrisdolopikos/eleectromyography-dataset.



Acknowledgment

This work is partially supported by the EPSRC-UK InDex project (EU CHIST-ERA programme), with reference EP/S032355/1, and by the Royal Society (UK) through the project "Sim2Real" with grant number RGS\R2\192498.


Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Dolopikos, C., Pritchard, M., Bird, J.J., Faria, D.R. (2021). Electromyography Signal-Based Gesture Recognition for Human-Machine Interaction in Real-Time Through Model Calibration. In: Arai, K. (eds) Advances in Information and Communication. FICC 2021. Advances in Intelligent Systems and Computing, vol 1364. Springer, Cham. https://doi.org/10.1007/978-3-030-73103-8_65
