
A hand gesture action-based emotion recognition system by 3D image sensor information derived from Leap Motion sensors for the specific group with restlessness emotion problems

  • Technical Paper
  • Published in: Microsystem Technologies

Abstract

The popularity of 3D image sensors on the market has spurred rapid development of hand-gesture-based applications. With 3D-space hand gesture data, gesture commands can be classified to control a specific target device, such as a smart speaker or smart TV. However, context-aware cognition such as human emotion recognition based on 3D hand gesture action characteristics has rarely been explored. In this work, focusing on a specific group with restlessness emotion problems, a hand gesture action-based emotion recognition system with gesture-making user identification is developed using the well-known Leap Motion sensor. By acquiring 3D-space action variation information from the gesture action a person makes, the specific human emotion category can then be recognized. Ten degrees of restlessness emotion behavior are defined according to a designed restlessness-emotion hand gesture categorization table that characterizes a restless hand gesture action by its speed, repetition, and attack. Three types of feature parameters derived from the Leap Motion sensor are presented: 78-dimensional data capturing 3D hand gesture action variations, 7-dimensional data capturing physical characteristics of the hand, and 85-dimensional data combining the 78-dimensional and 7-dimensional data. The 7-dimensional feature parameter set is employed for hand gesture-making user identification, and the other two feature parameter sets are used to classify the defined hand gesture actions with different restlessness emotion degrees. The popular K-nearest neighbor (KNN) approach is adopted as the gesture data classifier for performance evaluation of all of these feature parameter sets. Experiments on four subjects in a laboratory office show that KNN with the 85-dimensional feature data achieves an average recognition accuracy of 80.6%, slightly superior to the 80% of the 78-dimensional feature data. KNN with the 7-dimensional feature parameter set achieves an average accuracy of 66.1% on gesture-making user identification.
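The KNN classification step described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the vector contents, labels, and the choice of k are synthetic assumptions; only the dimensionalities (78-dim action features, 7-dim hand features, 85-dim concatenation) and the use of majority-vote KNN come from the abstract.

```python
import math
from collections import Counter

def knn_classify(train, labels, query, k=3):
    """Assign `query` the majority label among its k nearest training
    vectors under Euclidean distance (plain KNN, as used in the paper)."""
    dists = sorted((math.dist(x, query), y) for x, y in zip(train, labels))
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# Illustrative 85-dim vector: a 78-dim action part concatenated with a
# 7-dim hand-characteristic part (values are synthetic, not sensor data).
action, shape = [0.1] * 78, [0.5] * 7
sample = action + shape                      # 85-dim combined feature vector
train = [[0.1] * 85, [0.9] * 85, [0.12] * 85]
labels = ["degree_1", "degree_9", "degree_1"]
print(knn_classify(train, labels, sample, k=3))  # → degree_1
```

The same classifier is applied unchanged to the 7-dimensional vectors for user identification; only the feature extraction differs between the two tasks.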



Acknowledgements

This research is partially supported by the Ministry of Science and Technology (MOST) in Taiwan under Grant MOST 108-2221-E-150-037.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Ing-Jr Ding.


Cite this article

Ding, IJ., Hsieh, MC. A hand gesture action-based emotion recognition system by 3D image sensor information derived from Leap Motion sensors for the specific group with restlessness emotion problems. Microsyst Technol 28, 403–415 (2022). https://doi.org/10.1007/s00542-020-04868-9
