The Expression of Mental States in a Humanoid Robot

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10498)


We explore to what degree movement, together with facial features such as the eyes and mouth, can be used to convey mental states in a humanoid robot. Several animation variants were iteratively tested in a series of experiments, converging on a set of five expressive states that the robot can reliably convey. These expressions combine biologically motivated cues, such as eye movements and pupil dilation, with elements that have only a conventional significance, such as changes in eye color.
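The approach described above pairs each expressive state with a bundle of cues. As a minimal sketch of such a mapping, the snippet below pairs named states with hypothetical parameter values; the concrete five states and their cue settings are not specified in the abstract, so every name and number here is an illustrative assumption.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Expression:
    """Cue bundle for one expressive state (hypothetical schema)."""
    eye_color: str      # conventional cue: color shown in the robot's eyes
    pupil_scale: float  # biological cue: 1.0 = neutral pupil size
    gaze_speed: float   # biological cue: relative speed of eye movements

# Illustrative mapping from mental state to cues (assumed values,
# not taken from the paper).
EXPRESSIONS = {
    "curious": Expression(eye_color="green", pupil_scale=1.3, gaze_speed=1.5),
    "calm":    Expression(eye_color="blue",  pupil_scale=1.0, gaze_speed=0.5),
    "alarmed": Expression(eye_color="red",   pupil_scale=1.6, gaze_speed=2.0),
}

def express(state: str) -> Expression:
    """Look up the cue bundle for a named mental state."""
    return EXPRESSIONS[state]
```

Keeping the biological cues (pupil, gaze) and the conventional cue (eye color) as separate fields makes it easy to vary one class of cue while holding the other fixed, which is the kind of manipulation an iterative expression-design experiment would need.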


Keywords: Mental states · Emotions · Humanoid robot





Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Lund University Cognitive Science, Lund, Sweden
