Dynamic time warping–based feature selection method for foot gesture cobot operation mode selection

  • ORIGINAL ARTICLE
  • Published:
The International Journal of Advanced Manufacturing Technology

Abstract

Emerging consumer needs are pushing manufacturing companies from mass production toward mass customization. These new challenges shift the scenario from robots working in isolation from humans to robots collaborating with humans in a shared workspace (collaborative robotics). Wearable sensors using inertial measurement units (IMUs) are widely used to capture human upper-body gestures, for which the set of recognizable gestures is very large. However, foot gesture approaches are beginning to gain ground in applications where the human's hands are occupied while interacting with robots. This study presents an insole-based foot gesture recognition method for cobot operation mode selection. The insole comprises an IMU and four force sensors. The classification algorithm uses a support vector machine (SVM) classifier based on features extracted by means of dynamic time warping (DTW) applied to a single reference gesture signal. The dataset was collected from five human participants. As a case study, the system was interfaced in real time (real-time classification algorithm) through a Simulink 2020a scheme with a Universal Robots UR5 (5 kg payload). The worst-case recognition accuracy is around 88%. The algorithm adequately discriminates between 10 foot gestures captured by the wearable insole sensor. Moreover, this study shows that the control gestures can be accurately distinguished from other everyday activities such as walking, turning, and climbing stairs.
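The feature-extraction idea described in the abstract — a DTW distance between an incoming signal window and a single reference gesture recording, computed per sensor channel and fed to an SVM — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the channel count, window lengths, and local cost function are assumptions for the example.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) dynamic time warping distance
    between two 1-D signals, using absolute difference as the
    local cost (an assumption; the paper may use another cost)."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[n, m]

def dtw_features(window, reference):
    """Feature vector: one DTW distance per sensor channel between
    the incoming window and the single reference gesture signal.
    `window` and `reference` are (samples, channels) arrays; the
    two may have different lengths, which DTW handles naturally."""
    return np.array([dtw_distance(window[:, c], reference[:, c])
                     for c in range(window.shape[1])])

# Synthetic example: one "reference gesture" and two candidate windows.
rng = np.random.default_rng(0)
reference = rng.standard_normal((50, 3))   # e.g. 3 IMU axes (illustrative)
same = reference.copy()                    # identical to the reference
other = rng.standard_normal((60, 3))       # unrelated signal, longer window

print(dtw_features(same, reference))   # all zeros: identical signals
print(dtw_features(other, reference))  # strictly positive distances
```

The resulting low-dimensional feature vectors would then be passed to a standard SVM classifier (e.g. scikit-learn's `SVC`) to decide which of the gesture classes, if any, the window belongs to.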



Funding

This work received financial support from the Fonds de recherche du Québec—Nature et technologies (FRQNT), under grant number 2020-CO-275043 (Ramy Meziane) and NSERC Discovery grant number RGPIN-2018-06329 (Martin Otis). This project uses the infrastructure obtained by the Ministère de l'Économie et de l'Innovation (MEI) du Québec, the John R. Evans Leaders Fund of the Canada Foundation for Innovation (CFI), and the Infrastructure Operating Fund (FEI) under project number 35395.

Author information

Authors and Affiliations

Authors

Contributions

Conceptualization, G.V.T.D. and M.O.; methodology, G.V.T.D. and M.O.; software, G.V.T.D.; validation, G.V.T.D.; formal analysis, G.V.T.D.; investigation, G.V.T.D.; resources, M.O.; data curation, G.V.T.D.; writing—original draft preparation, G.V.T.D.; writing—review and editing, G.V.T.D. and M.O.; visualization, G.V.T.D. and M.O.; supervision, M.O. and R.M.; project administration, M.O.; funding acquisition, M.O. and R.M. All authors have read and agreed to the published version of the manuscript.

Corresponding author

Correspondence to Gilde Vanel Tchane Djogdom.

Ethics declarations

Conflict of interest

The authors declare no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article

Cite this article

Tchane Djogdom, G.V., Otis, M.J.D. & Meziane, R. Dynamic time warping–based feature selection method for foot gesture cobot operation mode selection. Int J Adv Manuf Technol 126, 4521–4541 (2023). https://doi.org/10.1007/s00170-023-11280-w

