
Dynamic Human Fatigue Detection Using Feature-Level Fusion

  • Xiao Fan
  • Bao-Cai Yin
  • Yan-Feng Sun
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5099)

Abstract

Driver fatigue is a significant factor in many traffic accidents. We propose novel dynamic features based on feature-level fusion for detecting driver fatigue from facial image sequences. First, Gabor filters are employed to extract multi-scale, multi-orientation features from each image, and these features are merged according to a fusion rule to produce a single fused image. To capture the temporal character of human fatigue, the fused image sequence is divided into dynamic units, and the histogram of each dynamic unit is computed and concatenated to form the dynamic features. Finally, a statistical learning algorithm is applied to select the most discriminative features and construct a strong classifier for fatigue detection. The test data contain 600 image sequences from thirty people. Experimental results demonstrate the validity of the proposed approach, whose correct rate is much higher than that of the baselines.
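The pipeline above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes OpenCV Gabor kernels, a max-magnitude fusion rule, fixed-length dynamic units, and scikit-learn's AdaBoostClassifier in place of the paper's boosting step; the helper names, scale and orientation counts, unit length, and histogram bin count are all illustrative assumptions.

    # Hypothetical sketch of the fatigue-detection pipeline described in the abstract
    # (not the authors' code); frames are assumed to be grayscale uint8 images.
    import cv2
    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier

    def gabor_bank(scales=5, orientations=8, ksize=17):
        """Build a multi-scale, multi-orientation Gabor filter bank (parameters assumed)."""
        kernels = []
        for s in range(scales):
            for o in range(orientations):
                theta = np.pi * o / orientations
                lambd = 4.0 * (1.5 ** s)  # wavelength grows with scale (assumption)
                kern = cv2.getGaborKernel((ksize, ksize), sigma=lambd / 2.0,
                                          theta=theta, lambd=lambd, gamma=0.5, psi=0)
                kernels.append(kern)
        return kernels

    def fuse_responses(gray, kernels):
        """Fuse all Gabor magnitude responses into a single image (max rule assumed)."""
        responses = [np.abs(cv2.filter2D(gray.astype(np.float32), cv2.CV_32F, k))
                     for k in kernels]
        return np.max(responses, axis=0)

    def dynamic_features(frames, unit_len=8, bins=32):
        """Split the fused sequence into dynamic units and concatenate per-unit histograms."""
        kernels = gabor_bank()
        fused = [fuse_responses(f, kernels) for f in frames]
        feats = []
        for start in range(0, len(fused) - unit_len + 1, unit_len):
            unit = np.stack(fused[start:start + unit_len])
            hist, _ = np.histogram(unit, bins=bins, range=(0.0, unit.max() + 1e-6))
            feats.append(hist / hist.sum())  # normalised histogram of one dynamic unit
        return np.concatenate(feats)

    # X: one feature vector per image sequence, y: fatigue / non-fatigue labels.
    # clf = AdaBoostClassifier(n_estimators=200).fit(X, y)

One motivation for fusing the bank of Gabor responses into a single image before building histograms, as the abstract describes, is that the per-unit feature dimension stays independent of the number of filters in the bank.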

Keywords

Computer vision · Human fatigue · Gabor filters · Fusion · AdaBoost

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Xiao Fan (1)
  • Bao-Cai Yin (1)
  • Yan-Feng Sun (1)
  1. Beijing Key Laboratory of Multimedia and Intelligent Software, College of Computer Science and Technology, Beijing University of Technology, Beijing, China
