
Face Emotion Detection for Autism Children Using Convolutional Neural Network Algorithms

  • Chapter
  • First Online:
Artificial Intelligence for Societal Issues

Part of the book series: Intelligent Systems Reference Library (ISRL, volume 231)

Abstract

Over the past few decades, emotion detection in real-time environments has been a progressive area of research. This study aims to identify physically challenged individuals and the cognitive gestures of children with autism by creating an emotion detection algorithm based on a Convolutional Neural Network (CNN) and facial landmarks. The algorithm runs in real time using virtual markers and performs efficiently under uneven lighting, head rotation, multiple backgrounds and distinct facial tones. Facial emotions such as happiness, sorrow, rage, surprise, disgust and fear are collected using these virtual markers. First, faces are detected using a cascade classifier. After face detection, the image is passed to a pre-processing stage that removes noise, which improves classification accuracy. Finally, the images are given to the CNN classifier to classify the six emotions. The performance of the proposed approach is analyzed in terms of accuracy, precision, recall and F-measure; the proposed CNN-based emotion detection achieved a cumulative recognition rate of 99.81%.
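The pipeline described in the abstract (cascade face detection, noise removal, resizing, then CNN classification) can be sketched as follows. This is a minimal illustration, not the chapter's actual implementation: the 48×48 input size, the 3×3 mean-filter denoising, and the function names are assumptions made for the example. In a real system the face crop would come from an OpenCV Haar cascade (`cv2.CascadeClassifier(...).detectMultiScale(...)`) and the normalized image would be fed to a trained CNN.

```python
import numpy as np

# Six emotion classes named in the abstract
EMOTIONS = ["happiness", "sorrow", "rage", "surprise", "disgust", "fear"]

def preprocess(face, size=48):
    """Normalize a cropped face image for a CNN: grayscale, denoise,
    resize to size x size, and scale pixel values to [0, 1].

    The 3x3 mean filter and nearest-neighbour resize below are simple
    stand-ins for the chapter's (unspecified) noise-removal step.
    """
    # Convert an RGB crop to grayscale by channel averaging
    if face.ndim == 3:
        face = face.mean(axis=2)

    # 3x3 mean filter as a basic noise-removal stage
    padded = np.pad(face, 1, mode="edge")
    denoised = np.zeros_like(face, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            denoised += padded[1 + dy:1 + dy + face.shape[0],
                               1 + dx:1 + dx + face.shape[1]]
    denoised /= 9.0

    # Nearest-neighbour resize to the assumed CNN input resolution
    ys = np.arange(size) * face.shape[0] // size
    xs = np.arange(size) * face.shape[1] // size
    resized = denoised[np.ix_(ys, xs)]

    # Scale 8-bit pixel values into [0, 1] for the classifier
    return resized / 255.0

# The resulting (size, size) array would then be reshaped to
# (1, size, size, 1) and passed to a trained CNN, whose softmax
# output indexes into EMOTIONS.
```

The denoising-before-classification order mirrors the abstract's claim that pre-processing is what raises classification accuracy; any comparable filter (median, Gaussian) could be substituted without changing the pipeline's structure.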



Author information


Correspondence to K. M. Umamaheswari.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Umamaheswari, K.M., Vignesh, M.T. (2023). Face Emotion Detection for Autism Children Using Convolutional Neural Network Algorithms. In: Biswas, A., Semwal, V.B., Singh, D. (eds) Artificial Intelligence for Societal Issues. Intelligent Systems Reference Library, vol 231. Springer, Cham. https://doi.org/10.1007/978-3-031-12419-8_10
