
Behavioral State Detection of Newborns Based on Facial Expression Analysis

  • Lykele Hazelhoff
  • Jungong Han
  • Sidarto Bambang-Oetomo
  • Peter H. N. de With
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5807)

Abstract

Prematurely born infants are observed at a Neonatal Intensive Care Unit (NICU) for medical treatment. While their vital body functions are continuously monitored, the incubator is covered with a blanket for medical reasons. This prevents visual observation of the newborns during most of the day, even though the facial expression is known to give valuable information about the presence of discomfort.

This prompted the authors to develop a prototype of an automated video-survey system that detects discomfort in newborn babies by analyzing their facial expression. Since only a reliable and situation-independent system is useful, we focus on robustness against non-ideal viewpoints and lighting conditions. Our proposed algorithm automatically segments the face from the background and localizes the eye, eyebrow and mouth regions. Based on measurements in these regions, a hierarchical classifier discriminates between the behavioral states sleep, awake and cry.
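The hierarchical decision can be pictured with a small Python sketch. This is not the authors' implementation: the region measurements, thresholds and the exact decision order below are assumptions made purely for illustration of a two-stage sleep/awake/cry classifier driven by eye, mouth and eyebrow features.

# Minimal illustrative sketch of a hierarchical sleep/awake/cry decision.
# All feature names and thresholds are hypothetical, not taken from the paper.
from dataclasses import dataclass

@dataclass
class FaceMeasurements:
    eye_openness: float      # normalized eyelid opening in the eye regions (0 = closed)
    mouth_openness: float    # normalized vertical mouth opening (0 = closed)
    brow_lowering: float     # normalized eyebrow lowering (0 = relaxed)

def classify_state(m: FaceMeasurements,
                   eye_thr: float = 0.2,
                   mouth_thr: float = 0.5,
                   brow_thr: float = 0.4) -> str:
    """Two-stage decision: first check for a cry pattern (wide-open mouth with
    lowered brows), then separate sleep from quiet-awake using the eye regions."""
    if m.mouth_openness > mouth_thr and m.brow_lowering > brow_thr:
        # Crying infants typically show a wide-open mouth and lowered brows,
        # whether or not the eyes are squeezed shut.
        return "cry"
    if m.eye_openness < eye_thr:
        return "sleep"
    return "awake"

if __name__ == "__main__":
    # Example measurements (hypothetical values):
    print(classify_state(FaceMeasurements(0.05, 0.1, 0.1)))  # -> sleep
    print(classify_state(FaceMeasurements(0.10, 0.8, 0.7)))  # -> cry
    print(classify_state(FaceMeasurements(0.60, 0.2, 0.1)))  # -> awake

In practice the measurements would come from the segmented eye, eyebrow and mouth regions; the sketch only shows how a hierarchical rule structure over such features could separate the three behavioral states.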

We have evaluated the described prototype system on recordings of three healthy newborns and show that our algorithm operates with approximately 95% accuracy. The system tolerates small changes in viewpoint and lighting conditions, but it fails when the illumination is strongly reduced or when the viewpoint deviates far from frontal.

Keywords

Facial Expression · Mouth Region · Viewpoint Change · Healthy Newborn · Heel Lance



Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Lykele Hazelhoff (1, 2)
  • Jungong Han (1)
  • Sidarto Bambang-Oetomo (1, 3)
  • Peter H. N. de With (1, 2)
  1. University of Technology Eindhoven, The Netherlands
  2. CycloMedia Technology B.V., Waardenburg, The Netherlands
  3. Maxima Medisch Centrum, Veldhoven, The Netherlands
