
Generation of Realistic Robot Facial Expressions for Human Robot Interaction


One factor contributing to successful long-term human–human interaction is that humans can express their emotions appropriately for the situation. Robots, in contrast, often lack diversity in facial expressions and gestures, and long-term human–robot interaction (HRI) has consequently seen limited success so far. In this paper, we propose a novel method for generating diverse and more realistic robot facial expressions to support long-term HRI. First, nine basic dynamics for robot facial expressions are determined from the dynamics of human facial expressions and the principles of animation, so that a facial robot can produce natural and varied expression changes for the same emotion. In the second stage, facial actions are added to produce more realistic expressions, such as sniffling or wailing loudly for sadness, or smiling or laughing aloud for happiness. To evaluate the effectiveness of our approach, we compared the facial expressions of the developed robot with and without the proposed method. Survey results show that the proposed method helps robots generate more realistic and diverse facial expressions.
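The abstract does not detail how the nine basic dynamics are realized. As an illustration only, expression transitions of the kind described, where the same target emotion is reached with different transition "styles", are often modeled with second-order linear systems, in which varying the damping ratio and natural frequency changes the character of the motion. A minimal sketch under that assumption (the function and parameter names are hypothetical, not from the paper):

```python
# Hypothetical sketch: reaching the same target expression intensity with
# different second-order dynamics, yielding distinct transition styles.
def simulate_transition(target, zeta, omega, dt=0.01, steps=300):
    """Euler-integrate x'' + 2*zeta*omega*x' + omega**2*x = omega**2*target."""
    x, v = 0.0, 0.0  # actuator position and velocity, starting from neutral
    traj = []
    for _ in range(steps):
        a = omega**2 * (target - x) - 2 * zeta * omega * v
        v += a * dt
        x += v * dt
        traj.append(x)
    return traj

# Underdamped (zeta < 1): overshoots the target, reads as an abrupt change.
lively = simulate_transition(1.0, zeta=0.3, omega=8.0)
# Overdamped (zeta > 1): smooth monotonic approach, reads as a gentle change.
calm = simulate_transition(1.0, zeta=2.0, omega=8.0)

print(max(lively) > 1.0)  # the underdamped trajectory overshoots
print(max(calm) <= 1.0)   # the overdamped trajectory does not
```

Both trajectories settle at the same expression intensity; only the path differs, which is one way a robot could show "diverse expression changes for identical emotions".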





Author information

Corresponding author

Correspondence to Myung Jin Chung.



Cite this article

Park, J.W., Lee, H.S. & Chung, M.J. Generation of Realistic Robot Facial Expressions for Human Robot Interaction. J Intell Robot Syst 78, 443–462 (2015).


Keywords

  • Dynamics
  • Facial actions
  • Realistic facial expressions
  • Human–robot interaction

Mathematics Subject Classification (2010)

  • 70E60
  • 68T40