Abstract
One factor contributing to successful long-term human-human interaction is that humans can express their emotions appropriately for the situation. Robots, by contrast, often lack diversity in facial expressions and gestures, and long-term human-robot interaction (HRI) has consequently seen limited success thus far. In this paper, we propose a novel method for generating diverse and more realistic robot facial expressions to support long-term HRI. First, nine basic dynamics for robot facial expressions are defined, based on the dynamics of human facial expressions and the principles of animation, so that a facial robot can produce natural and varied expression changes for the same emotion. In the second stage, facial actions are added to produce more realistic expressions, such as sniffling or wailing loudly for sadness, or smiling or laughing aloud for happiness. To evaluate the effectiveness of our approach, we compared the facial expressions of the developed robot with and without the proposed method. The survey results showed that the proposed method helps robots generate more realistic and diverse facial expressions.
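The abstract describes varying the dynamics of an expression transition so that the same target emotion can be reached in visibly different ways. One common way to model such transitions (used here purely as an illustrative sketch, not the authors' actual implementation) is a second-order system driving each facial actuator toward its target pose; changing the damping ratio alone already yields distinct motion styles, e.g. an overshooting "exaggerated" change versus a smooth one. All function and parameter names below are hypothetical.

```python
def expression_trajectory(target, wn=8.0, zeta=0.3, dt=0.01, steps=200):
    """Simulate a facial-actuator position governed by the second-order
    dynamics  x'' + 2*zeta*wn*x' + wn^2*x = wn^2*target  via semi-implicit
    Euler integration, starting from rest at x = 0.

    target : desired actuator position for the expression
    wn     : natural frequency (how fast the transition is)
    zeta   : damping ratio (the "style" of the transition)
    """
    x, v = 0.0, 0.0
    traj = []
    for _ in range(steps):
        a = wn * wn * (target - x) - 2.0 * zeta * wn * v  # acceleration
        v += a * dt
        x += v * dt
        traj.append(x)
    return traj

# The same target expression reached with two different dynamics:
# an underdamped transition overshoots the target (reads as exaggerated),
# while a critically damped one settles smoothly.
snappy = expression_trajectory(1.0, zeta=0.3)
smooth = expression_trajectory(1.0, zeta=1.0)
```

A set of such parameterizations (different `wn`/`zeta` pairs, delays, or asymmetric onset/offset speeds) is one plausible reading of what a family of "basic dynamics" for a single emotion could look like.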
Cite this article
Park, J.W., Lee, H.S. & Chung, M.J. Generation of Realistic Robot Facial Expressions for Human Robot Interaction. J Intell Robot Syst 78, 443–462 (2015). https://doi.org/10.1007/s10846-014-0066-1