
Multimodal emotion recognition algorithm based on edge network emotion element compensation and data fusion

  • Original Article
  • Published in Personal and Ubiquitous Computing

Abstract

Emotion-recognition feature sets derived from complex networks suffer from redundant information, recognition difficulty, and data loss, all of which strongly interfere with the emotional features extracted from speech or images. To address these problems, this paper studies a multimodal emotion recognition algorithm based on emotion element compensation, set in the context of streaming-media communication over an edge network. First, an edge streaming-media network is designed that shifts transmission tasks from the traditional central server to edge nodes; this architecture recasts complex network problems as edge-node and user-side problems. Second, multimodal parallel training is realized through a cooperative combination of weight equalization, and the nonlinear mapping is guided toward a better emotional data-fusion relationship. Then, considering the nonlinearity and uncertainty of the different types of emotional data samples in the training subset, emotion-recognition data compensation is evolved into emotion element compensation, which facilitates qualitative analysis and optimal decision-making. Finally, simulation results show that the proposed multimodal emotion recognition algorithm improves the recognition rate by 3.5%, reduces the average response time by 5.7%, and improves the average number of iterations per unit time by a factor of 1.35.
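The weighted, decision-level combination of per-modality predictions described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the label set, modality names, and equal weights below are hypothetical, and the fusion shown is a simple normalized weighted average of per-modality probability vectors.

```python
import numpy as np

EMOTIONS = ["angry", "happy", "neutral", "sad"]  # hypothetical label set

def fuse_predictions(modality_probs, weights):
    """Decision-level fusion: combine per-modality emotion probability
    vectors using normalized modality weights, then pick the top label."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                       # normalize weights to sum to 1
    probs = np.asarray(modality_probs)    # shape: (n_modalities, n_labels)
    fused = w @ probs                     # weighted average of distributions
    return EMOTIONS[int(np.argmax(fused))], fused

# Example: speech leans "sad", image leans "neutral"; equal weights.
speech = [0.10, 0.10, 0.30, 0.50]
image  = [0.05, 0.15, 0.45, 0.35]
label, fused = fuse_predictions([speech, image], weights=[0.5, 0.5])
```

With these inputs the fused distribution still favors "sad", showing how a confident modality can dominate when the weights are balanced; the paper's weight-equalization scheme would instead learn the weights during parallel training.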



Funding

This work is supported in part by the Key Scientific Research Projects of the Henan Province Education Department (No. 18A520004), the Henan Province Science and Technology Projects (No. 182102310925), and the National Natural Science Foundation of China (No. 61802115).

Author information


Correspondence to Yu Wang.


Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Wang, Y. Multimodal emotion recognition algorithm based on edge network emotion element compensation and data fusion. Pers Ubiquit Comput 23, 383–392 (2019). https://doi.org/10.1007/s00779-018-01195-9
