The Journal of Supercomputing, Volume 75, Issue 4, pp 1909–1921

Genetic algorithm-based adaptive weight decision method for motion estimation framework

  • Jeongsook Chae
  • Yong Jin
  • Mingyun Wen
  • Weiqiang Zhang
  • Yunsick Sung
  • Kyungeun Cho


Recently, diverse virtual reality devices have been developed and put to use. In particular, devices that recognize user motions, such as gripping and opening the hands, have attracted attention as a way to use these motions as an input method. Previous research on motion recognition proposed motion estimation methods that compute Bayesian probabilities after measuring the orientation of motions with a Myo, a contact-type motion recognition device. However, these methods suffer from low estimation accuracy because the orientation components x, y, and z are processed separately, each considering only the values of its own axis. To improve motion estimation accuracy, motions should be estimated by considering the values of all axes together. This paper proposes a method that uses a genetic algorithm to calculate weights, which are then applied when estimating motions through Bayesian probability over all axes of orientations measured with a Myo. The proposed method consists of three steps. First, Bayesian probabilities are calculated considering the correlations among the x, y, and z components of the Myo orientation. Second, weights are determined by applying a genetic algorithm. Third, motions are estimated through the Bayesian probability with the determined weights. Experiments comparing the traditional min/max-based method with the proposed method showed that the proposed method reduced the difference of the orientations by 32%.
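The three steps above can be sketched in code. The following is a minimal, illustrative sketch only: the sample data, the per-axis Gaussian log-likelihood standing in for the paper's Bayesian probability, and all function names and GA parameters (population size, crossover, mutation rate) are assumptions for demonstration, not the authors' implementation.

```python
import math
import random

random.seed(42)

# Hypothetical labeled orientation samples (x, y, z) for two motions,
# standing in for Myo orientation measurements.
MOTIONS = {
    "grip": [(0.9 + random.gauss(0, 0.05), 0.1 + random.gauss(0, 0.05),
              0.2 + random.gauss(0, 0.05)) for _ in range(30)],
    "open": [(0.2 + random.gauss(0, 0.05), 0.8 + random.gauss(0, 0.05),
              0.3 + random.gauss(0, 0.05)) for _ in range(30)],
}

def gaussian_log_prob(v, mean, std):
    """Per-axis Gaussian log-likelihood (simplified Bayesian score)."""
    std = max(std, 1e-6)
    return -0.5 * ((v - mean) / std) ** 2 - math.log(std)

def fit_models(data):
    """Estimate per-motion, per-axis (mean, std) from the samples."""
    models = {}
    for motion, samples in data.items():
        per_axis = []
        for axis_vals in zip(*samples):
            mean = sum(axis_vals) / len(axis_vals)
            var = sum((v - mean) ** 2 for v in axis_vals) / len(axis_vals)
            per_axis.append((mean, var ** 0.5))
        models[motion] = per_axis
    return models

def score(sample, model, weights):
    # Step 1/3: weighted sum of per-axis log-likelihoods, so all
    # axes contribute jointly to one motion score.
    return sum(w * gaussian_log_prob(v, m, s)
               for w, v, (m, s) in zip(weights, sample, model))

def accuracy(weights, models, data):
    correct = total = 0
    for motion, samples in data.items():
        for s in samples:
            pred = max(models, key=lambda m: score(s, models[m], weights))
            correct += pred == motion
            total += 1
    return correct / total

def evolve(models, data, pop_size=20, generations=30):
    """Step 2: genetic algorithm over 3-element weight vectors."""
    pop = [[random.random() for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda w: -accuracy(w, models, data))
        elite = pop[:pop_size // 2]          # selection
        children = []
        for _ in range(pop_size - len(elite)):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, 3)     # single-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.3:        # mutation
                child[random.randrange(3)] = random.random()
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda w: accuracy(w, models, data))

models = fit_models(MOTIONS)
best = evolve(models, MOTIONS)
print(f"accuracy with evolved weights: {accuracy(best, models, MOTIONS):.2f}")
```

The fitness function here is classification accuracy on the labeled samples; the GA searches for the axis weights that maximize it, rather than treating each axis in isolation.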


Keywords: Bayesian probability · Genetic algorithm · Motion estimation · Weight



This research was supported by BK21 Plus project of the National Research Foundation of Korea Grant and by the MSIP (Ministry of Science, ICT and Future Planning), Korea, under the ICT R&D Voucher support program (2016-0-00225(R7317-16-0011)) supervised by the IITP (Institute for Information & Communications Technology Promotion).



Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Department of Multimedia Engineering, Dongguk University-Seoul, Seoul, Republic of Korea
