
E.Y.E. C. U.: an Emotional eYe trackEr for Cultural heritage sUpport

  • Conference paper
  • In: Empowering Organizations

Abstract

Imagine enjoying a painting, a sculpture or, more generally, a piece of art while, at the same time, receiving all the information you need about it. In this paper, we present E.Y.E. C. U. (read "I see you"), a modular eye-tracking system that supports the fruition of art galleries without diverting visitors' attention. Every time a visitor lingers on a painting detail, a hidden camera detects her gaze and the framework projects, in real time, the related illustrative content onto the wall region around the detail, fully realizing the meaning of augmented reality. E.Y.E. C. U. enhances its gaze-detection functionality with an emotional-analysis module: since the pupil is well known to reflect emotional arousal, we monitor its size in order to detect radius variations. Once the visitor has completed her visit, the system summarizes the observed details and the emotional reactions in a report.
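The arousal-detection idea in the abstract — flag moments when the pupil radius deviates from its resting size — can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: the function name `arousal_events`, the z-score criterion, and the threshold value are all assumptions for the sake of the example.

```python
# Hypothetical sketch: flag frames whose pupil radius deviates markedly
# from a resting baseline, as a proxy for emotional arousal.
from statistics import mean, stdev

def arousal_events(radii, baseline, threshold=2.0):
    """Return indices of frames whose pupil radius deviates from baseline.

    radii     -- per-frame pupil radius estimates (e.g. in pixels)
    baseline  -- radii recorded during a calibration / resting phase
    threshold -- deviation, in baseline standard deviations, counted as arousal
    """
    mu, sigma = mean(baseline), stdev(baseline)
    return [i for i, r in enumerate(radii)
            if sigma > 0 and abs(r - mu) / sigma > threshold]

# Stable resting phase, then a sudden dilation at frames 3 and 4:
baseline = [4.0, 4.1, 3.9, 4.0, 4.1, 3.9]
radii = [4.0, 4.05, 3.95, 5.2, 5.4, 4.1]
print(arousal_events(radii, baseline))  # → [3, 4]
```

A deployed system would of course estimate the radius from camera frames (e.g. via circle fitting on the detected eye region) and smooth the signal before thresholding; the sketch only shows the deviation test itself.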



Acknowledgment

Work supported by the European Community and the Italian Ministry of University and Research under the PON Or.C.He.S.T.R.A. (ORganization of Cultural HEritage and Smart Tourism and Real-time Accessibility) project.

Author information

Correspondence to Davide Maria Calandra.


Copyright information

© 2016 Springer International Publishing Switzerland

Cite this paper

Calandra, D.M., Di Mauro, D., D’Auria, D., Cutugno, F. (2016). E.Y.E. C. U.: an Emotional eYe trackEr for Cultural heritage sUpport. In: Torre, T., Braccini, A., Spinelli, R. (eds) Empowering Organizations. Lecture Notes in Information Systems and Organisation, vol 11. Springer, Cham. https://doi.org/10.1007/978-3-319-23784-8_13

