
Research on emotion-embedded design flow based on deep learning technology

International Journal of Technology and Design Education

Abstract

Designers continually pursue designs that convey suitable emotions. Effective emotional integration not only produces a good user experience but also extends the product lifecycle, so the decoding of design emotion and the use of an emotional design language should run through the entire design process. In this study, we propose a new emotion-embedded design flow (EFlow) based on design big data and deep learning technology. The method focuses on how emotion is fed into the design process and improves the effectiveness of emotional design. We collected an emotion database of 2054 labeled images and propose a deep fuzzy classification network. By using deep learning to automatically judge the emotional content of both design reference materials and design outputs, EFlow saves labor and testing costs and gives designers a reference for optimizing and improving the design process. It promotes a new way of thinking about connecting artificial intelligence technology with the design field.
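The abstract names the flow's key computational component, a deep fuzzy classification network that judges the emotional content of images, but does not describe its architecture. Below is a minimal, hypothetical PyTorch sketch of such a network under stated assumptions: a VGG-16 backbone topped by a head that outputs degrees of membership over emotion categories, trained against annotator-derived soft labels. The class name `FuzzyEmotionNet`, the choice of eight emotion categories, the backbone, and the KL-divergence loss are illustrative assumptions, not the authors' published design.

```python
# Hypothetical sketch of a "deep fuzzy classification network" for image
# emotion: a CNN backbone whose head emits a fuzzy membership vector over
# emotion categories rather than a single hard label. All specifics
# (category count, backbone, loss) are assumptions, not the paper's design.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models

NUM_EMOTIONS = 8  # assumption: the abstract does not state the category count


class FuzzyEmotionNet(nn.Module):
    def __init__(self, num_emotions: int = NUM_EMOTIONS, pretrained: bool = True):
        super().__init__()
        weights = models.VGG16_Weights.DEFAULT if pretrained else None
        backbone = models.vgg16(weights=weights)
        self.features = backbone.features         # convolutional feature extractor
        self.pool = nn.AdaptiveAvgPool2d((7, 7))
        self.head = nn.Sequential(                 # fuzzy-membership head
            nn.Flatten(),
            nn.Linear(512 * 7 * 7, 1024),
            nn.ReLU(inplace=True),
            nn.Dropout(0.5),
            nn.Linear(1024, num_emotions),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Softmax turns logits into a membership distribution over emotions,
        # so one image can belong to several emotion classes to varying degrees.
        return F.softmax(self.head(self.pool(self.features(x))), dim=1)


def fuzzy_loss(pred: torch.Tensor, soft_target: torch.Tensor) -> torch.Tensor:
    """KL divergence between predicted and annotated emotion distributions.

    `soft_target` is a per-image distribution (e.g. the fraction of
    annotators who chose each emotion), not a one-hot label.
    """
    return F.kl_div(pred.log(), soft_target, reduction="batchmean")


if __name__ == "__main__":
    model = FuzzyEmotionNet(pretrained=False)      # skip weight download for the demo
    images = torch.randn(4, 3, 224, 224)           # dummy mini-batch of RGB images
    targets = torch.rand(4, NUM_EMOTIONS)
    targets = targets / targets.sum(dim=1, keepdim=True)  # normalise to distributions
    print(fuzzy_loss(model(images), targets).item())
```

A soft-membership output of this kind lets a single image express mixed emotions, which fits the flow's goal of judging reference materials whose emotional tone is rarely unambiguous.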



Acknowledgements

We would like to express our gratitude to the designers and participants for their contributions to the current study. This study was supported by the National Natural Science Foundation of China (62277038).

Author information


Corresponding author

Correspondence to Junyu Yang.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Zhao, T., Jia, J., Zhu, T. et al. Research on emotion-embedded design flow based on deep learning technology. Int J Technol Des Educ 34, 345–362 (2024). https://doi.org/10.1007/s10798-023-09815-z


