Capturing Student Real Time Facial Expression for More Realistic E-learning Environment

  • Asanka D. Dharmawansa
  • Katsuko T. Nakahira
  • Yoshimi Fukumura
Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 14)

Abstract

With the development of information and communications technology, E-learning has grown rapidly. The environment is one of the major factors in E-learning performance: a more realistic E-learning environment contributes to an effective learning experience. Facial expressions are assumed to have a great impact on behavior, including learning behavior. This work attempts to transfer the real user's facial features into the virtual learning place. A real-time facial feature detection system is developed that continually extracts the E-learner's facial expressions. The character in the virtual learning environment that represents the real user, called an 'Avatar', changes its face when the real user's face changes; appropriate avatar face states are prepared to make the real user's face data visible in the virtual environment. In addition, other participants can view the user's face data through a web component to observe the E-learner's facial behavior.
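The pipeline the abstract describes (detect the learner's expression in real time, then switch the avatar's face to a matching prepared state) can be sketched at a high level. The expression labels, the `EXPRESSION_TO_AVATAR_FACE` mapping, and the function name below are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of the expression-to-avatar mapping step.
# In the paper, expressions would come from a real-time facial feature
# detector (e.g. a Viola-Jones-style face detector feeding a classifier);
# here the detector output is assumed to be a simple string label.

EXPRESSION_TO_AVATAR_FACE = {
    "neutral": "avatar_face_neutral",
    "smile": "avatar_face_smile",
    "surprise": "avatar_face_surprise",
}

def update_avatar(detected_expression: str) -> str:
    """Return the prepared avatar face state for a detected expression.

    Unrecognized labels fall back to the neutral face so the avatar
    always displays a valid state between detector updates.
    """
    return EXPRESSION_TO_AVATAR_FACE.get(
        detected_expression, EXPRESSION_TO_AVATAR_FACE["neutral"]
    )
```

Called once per detector frame, `update_avatar("smile")` would select the smiling avatar face, while an unknown or low-confidence label leaves the avatar neutral.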



Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Asanka D. Dharmawansa
    • 1
  • Katsuko T. Nakahira
    • 1
  • Yoshimi Fukumura
    • 1
  1. 1.Dept. of Management and Information Systems EngineeringNagaoka University of TechnologyNagaokaJapan
