Pattern Analysis and Applications, Volume 19, Issue 2, pp 517–530

Periocular recognition: how much facial expressions affect performance?

  • Elisa Barroso
  • Gil Santos
  • Luis Cardoso
  • Chandrashekhar Padole
  • Hugo Proença
Industrial and Commercial Application

Abstract

Using information from the region surrounding the human eye to perform biometric recognition has been gaining popularity. Previous work in this area, known as periocular recognition, reports remarkably low error rates and particularly high robustness when data are acquired under less controlled conditions. One factor in this field that remains to be studied is the effect of facial expressions on recognition performance, as expressions change the textural and shape information inside the periocular region. We collected a multisession dataset whose single variation is the subjects’ facial expressions and analyzed the corresponding variations in performance using a state-of-the-art periocular recognition strategy. We compared the effectiveness of different strategies for handling the effects of facial expressions: (1) single-sample enrollment, (2) multisample enrollment, and (3) multisample enrollment with facial expression recognition, with results also validated on the well-known Cohn–Kanade AU-Coded Expression dataset. Finally, the role of each type of facial expression in the biometric menagerie effect is discussed.

Keywords

Periocular recognition · Biometrics


Copyright information

© Springer-Verlag London 2015

Authors and Affiliations

  • Elisa Barroso (1)
  • Gil Santos (1)
  • Luis Cardoso (1)
  • Chandrashekhar Padole (1)
  • Hugo Proença (1)

  1. Department of Computer Science, IT-Instituto de Telecomunicações, University of Beira Interior, Covilhã, Portugal