Abstract
In this work, we achieve up to 92% classification accuracy of electromyographic (EMG) data between five gestures in pseudo-real-time. Most current state-of-the-art methods in electromyographic signal processing cannot classify real-time data in a post-learning environment, that is, after the model has been trained and its results analysed. We show that a process of model calibration raises real-time classification accuracy from 67.87% to 91.93%, an increase of 24.06 percentage points. We also show that an ensemble of classical machine learning models can outperform a Deep Neural Network. An original dataset of EMG data is collected from 15 subjects performing four gestures (Open-Fingers, Wave-Out, Wave-In, Close-Fist) using a Myo Armband to measure forearm muscle activity. The dataset is cleaned between gesture performances on a per-subject basis, and a sliding temporal window algorithm performs statistical analysis of the EMG signals, extracting meaningful mathematical features as input to the learning paradigms. The classifiers used in this paper are a Random Forest, a Support Vector Machine, a Multilayer Perceptron, and a Deep Neural Network. The three classical classifiers are combined into a single model through an ensemble voting system, which scores 91.93% compared to the Deep Neural Network's 88.68%, both after calibrating to a subject and performing real-time classification (pre-calibration scores for the two being 67.87% and 74.27%, respectively).
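The pipeline described above — sliding-window statistical feature extraction over multi-channel EMG, followed by a voting ensemble of the three classical classifiers — can be sketched as follows. This is a minimal illustration, not the authors' exact configuration: the window length, step size, feature set (mean absolute value, standard deviation, min, max per channel), hyperparameters, and the synthetic data are all assumptions made for the sake of a runnable example.

```python
# Sketch: sliding-window feature extraction over 8-channel EMG (as from a
# Myo Armband), then a soft-voting ensemble of Random Forest, SVM, and MLP.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

def window_features(emg, win=50, step=25):
    """Slide a temporal window over (samples, channels) EMG and compute
    simple per-channel statistics: mean absolute value, std, min, max."""
    feats = []
    for start in range(0, emg.shape[0] - win + 1, step):
        w = emg[start:start + win]
        feats.append(np.concatenate([
            np.abs(w).mean(axis=0), w.std(axis=0),
            w.min(axis=0), w.max(axis=0)]))
    return np.array(feats)

# Toy stand-in data: two synthetic "gestures" differing in signal amplitude.
rng = np.random.default_rng(0)
X = np.vstack([window_features(rng.normal(0.0, s, (1000, 8)))
               for s in (1.0, 3.0)])
y = np.repeat([0, 1], X.shape[0] // 2)

# Soft voting averages the predicted class probabilities of the three
# models (SVC needs probability=True to enable Platt-scaled probabilities).
ensemble = VotingClassifier(
    estimators=[('rf', RandomForestClassifier(random_state=0)),
                ('svm', SVC(probability=True, random_state=0)),
                ('mlp', MLPClassifier(max_iter=500, random_state=0))],
    voting='soft')
ensemble.fit(X, y)
print(ensemble.score(X, y))
```

In the same spirit, the paper's per-subject calibration step would correspond to fitting (or re-fitting) such a model on a short sample of the target subject's own EMG before real-time use, rather than relying only on data from other subjects.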
Notes
1. Dataset available at https://www.kaggle.com/chrisdolopikos/eleectromyography-dataset.
Acknowledgment
This work is partially supported by the EPSRC-UK InDex project (EU CHIST-ERA programme), reference EP/S032355/1, and by the Royal Society (UK) through the project "Sim2Real", grant number RGS\R2\192498.
Copyright information
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Dolopikos, C., Pritchard, M., Bird, J.J., Faria, D.R. (2021). Electromyography Signal-Based Gesture Recognition for Human-Machine Interaction in Real-Time Through Model Calibration. In: Arai, K. (eds) Advances in Information and Communication. FICC 2021. Advances in Intelligent Systems and Computing, vol 1364. Springer, Cham. https://doi.org/10.1007/978-3-030-73103-8_65
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-73102-1
Online ISBN: 978-3-030-73103-8
eBook Packages: Intelligent Technologies and Robotics (R0)