E.Y.E. C. U.: an Emotional eYe trackEr for Cultural heritage sUpport

  • Davide Maria Calandra
  • Dario Di Mauro
  • Daniela D’Auria
  • Francesco Cutugno
Conference paper
Part of the Lecture Notes in Information Systems and Organisation book series (LNISO, volume 11)

Abstract

Enjoying a painting, a sculpture or, more generally, a work of art while receiving all the information you need about it: in this paper, we present E.Y.E. C. U. (read "I see you"), a modular eye-tracking system which supports the enjoyment of art galleries without diverting visitors' attention. Every time a visitor lingers on a painting detail, a hidden camera detects her gaze and the framework projects, in real time, the related illustrative content onto the wall region around that detail, realizing augmented reality in a quite literal sense. E.Y.E. C. U. extends its gaze-detection functionality with an emotional analysis module: since the pupil is well known to reflect emotional arousal, we monitor its size in order to detect radius variations. Once the visitor has completed her visit, the system summarizes the observed details and the emotional reactions in a report.
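The emotional-analysis step described above, detecting pupil radius variations against a running baseline, can be sketched as follows. This is a hypothetical illustration only, not the authors' implementation: the function name, the baseline window length, and the z-score threshold are assumptions.

```python
from statistics import mean, stdev

def detect_arousal_events(radii, baseline_window=30, z_threshold=2.0):
    """Flag pupil-radius samples that deviate strongly from a running
    baseline, as a crude proxy for emotional arousal events.

    radii: sequence of pupil radius measurements (e.g. in pixels).
    Returns a list of (sample_index, radius, z_score) tuples.
    """
    events = []
    for i in range(baseline_window, len(radii)):
        window = radii[i - baseline_window:i]
        mu = mean(window)
        sigma = stdev(window) or 1e-6  # guard against a perfectly flat window
        z = (radii[i] - mu) / sigma
        if abs(z) >= z_threshold:
            events.append((i, radii[i], z))
    return events

# Usage: 30 near-constant samples, then a sudden dilation.
samples = [3.0 + 0.01 * (-1) ** i for i in range(30)] + [3.5]
print(detect_arousal_events(samples))
```

A per-visitor baseline matters here because resting pupil size varies with ambient light and between individuals; thresholding relative deviation rather than absolute radius keeps the detector comparable across visitors.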

Keywords

Emotion tracking · Affective computing · Pupil dilatation


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Davide Maria Calandra¹
  • Dario Di Mauro¹
  • Daniela D’Auria¹
  • Francesco Cutugno¹

  1. Department of Electrical Engineering and Information Technology (DIETI), University of Naples “Federico II”, Naples, Italy