Computer Vision/Computer Graphics Collaboration Techniques

Volume 4418 of the series Lecture Notes in Computer Science pp 555-566

Classification of Facial Expressions Using K-Nearest Neighbor Classifier

  • Abu Sayeed Md. Sohail, Department of Computer Science and Software Engineering, Concordia University, 1455 de Maisonneuve Blvd. West, Montreal, Quebec H3G 1M8
  • Prabir Bhattacharya, Concordia Institute for Information Systems Engineering, Concordia University, 1515 St. Catherine West, Montreal, Quebec H3G 2W1


In this paper, we present a fully automatic technique for detecting and classifying the six basic facial expressions from nearly frontal face images. Facial expressions are communicated by subtle changes in one or more discrete features, such as tightening the lips, raising the eyebrows, opening and closing the eyes, or certain combinations of them. These discrete features can be identified by monitoring changes in muscle movements (Action Units) located around the regions of the mouth, eyes, and eyebrows. In this work, we use eleven feature points that represent and identify the principal muscle actions and provide measurements of the discrete features responsible for each of the six basic human emotions. A multi-detector approach to facial feature point localization is used to identify these points of interest from the contours of facial components such as the eyes, eyebrows, and mouth. A feature vector of eleven features is then obtained by calculating the displacement of each of these eleven feature points from a fixed rigid point. Finally, the resulting feature sets are used to train a K-Nearest Neighbor classifier so that it can classify a facial expression given its feature vector. The developed Automatic Facial Expression Classifier has been tested on a publicly available facial expression database, achieving an average classification rate of 90.76%.
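The classification step described above can be sketched with a minimal K-Nearest Neighbor implementation. The synthetic 11-dimensional displacement vectors, cluster centers, and the choice of k = 3 below are illustrative stand-ins for the paper's measured feature-point displacements, not its actual data or parameters.

```python
import numpy as np

# The six basic expressions the paper classifies.
EXPRESSIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def knn_classify(train_X, train_y, query, k=3):
    """Label a query feature vector by majority vote among its k
    nearest training vectors under Euclidean distance."""
    dists = np.linalg.norm(train_X - query, axis=1)
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Hypothetical stand-in for the training data: one 11-dimensional
# feature-point displacement vector per face image, drawn from a
# separate synthetic cluster per expression (illustrative values only).
rng = np.random.default_rng(42)
centers = rng.normal(scale=5.0, size=(6, 11))   # one cluster per expression
X = np.vstack([c + rng.normal(scale=0.4, size=(30, 11)) for c in centers])
y = np.repeat(EXPRESSIONS, 30)

# Classify one known sample's displacement vector.
print(knn_classify(X, y, X[0]))
```

With well-separated clusters, as here, a training sample's nearest neighbors fall in its own cluster, so the vote recovers its label; on real data the paper reports 90.76% average accuracy.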