3-D emotion space for interactive communication
In this paper, methods for modeling facial expression and emotion are proposed. This emotion model, called the 3-D Emotion Space, can represent the emotional conditions of both humans and computers, as they appear on the face, as a coordinate in the 3-D space.
For the construction of this 3-D Emotion Space, a 5-layer neural network, which offers superior non-linear mapping performance, is applied. After training the network with backpropagation to realize an identity mapping, both the mapping from facial expression parameters to the 3-D Emotion Space and the inverse mapping from the Emotion Space back to the expression parameters were realized.
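The 5-layer identity-mapping network described above is, in modern terms, an autoencoder with a 3-unit bottleneck: the first half of the network analyzes expression parameters into 3-D emotion coordinates, and the second half synthesizes parameters back from those coordinates. The sketch below illustrates this structure; the layer sizes, toy data, and learning rate are assumptions for illustration, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class EmotionAutoencoder:
    """5-layer network trained for identity mapping (illustrative sketch).

    Layer sizes: n_params -> n_hidden -> 3 -> n_hidden -> n_params.
    The 3-unit middle layer plays the role of the 3-D Emotion Space.
    """

    def __init__(self, n_params=6, n_hidden=8, n_emotion=3):
        sizes = [n_params, n_hidden, n_emotion, n_hidden, n_params]
        self.W = [rng.normal(0, 0.5, (a, b)) for a, b in zip(sizes[:-1], sizes[1:])]
        self.b = [np.zeros(b) for b in sizes[1:]]

    def forward(self, x):
        # Return activations of every layer, input included.
        acts = [x]
        for W, b in zip(self.W, self.b):
            acts.append(sigmoid(acts[-1] @ W + b))
        return acts

    def train_step(self, X, lr=0.5):
        # Backpropagation for identity mapping: the target equals the input.
        acts = self.forward(X)
        delta = (acts[-1] - X) * acts[-1] * (1 - acts[-1])
        for i in reversed(range(len(self.W))):
            gradW = acts[i].T @ delta / len(X)
            gradb = delta.mean(axis=0)
            if i > 0:  # propagate the error before updating this layer
                delta = (delta @ self.W[i].T) * acts[i] * (1 - acts[i])
            self.W[i] -= lr * gradW
            self.b[i] -= lr * gradb
        return float(np.mean((acts[-1] - X) ** 2))

    def analyze(self, x):
        # First half: expression parameters -> 3-D emotion coordinates.
        return self.forward(x)[2]

    def synthesize(self, e):
        # Second half (inverse mapping): emotion coordinates -> parameters.
        h = sigmoid(e @ self.W[2] + self.b[2])
        return sigmoid(h @ self.W[3] + self.b[3])

# Toy "expression parameter" vectors in [0, 1].
X = rng.random((32, 6))
net = EmotionAutoencoder()
first = net.train_step(X)
for _ in range(2000):
    last = net.train_step(X)
```

Because analysis and synthesis are the two halves of one network trained jointly, the same model supports both recognizing an emotion from expression parameters and generating expression parameters from a chosen emotion coordinate, which is the key property the paper exploits.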
As a result, a system that can simultaneously analyze and synthesize facial expressions was constructed.
Moreover, the inverse mapping to facial expressions was assessed by subjective evaluation, using the synthesized expressions as test images. The results demonstrated the model's high performance in describing natural facial expressions and emotional conditions.