Automatic Analysis of Eye-Tracking Data for Augmented Reality Applications: A Prospective Outlook

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9769)


Eye-tracking technology is becoming easier and cheaper to use, and is consequently being applied to a growing number of research fields; recent years have seen rapid developments in this area. Against this background, defining a modern approach to understanding how individuals perceive art is a challenge for Cultural Heritage (CH). Although art perception is highly subjective and varies with knowledge and experience, several scientific studies and companies have recently begun to quantify how subjects observe art by applying eye-tracking technology. The aim of this study was to understand the visual behaviour of subjects looking at paintings, using eye-tracking technology, in order to define a protocol for optimizing an existing Augmented Reality (AR) application that allows the visualization of digital contents through a display. The stimuli were three famous paintings preserved at the National Gallery of the Marche (Urbino, Marche Region, Italy). We applied eye-tracking to gain a deeper understanding of people's visual activity in front of these paintings and to analyse how digital contents may influence their behaviour. The applied procedure and preliminary results are described.
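The paper itself does not include code, but the kind of analysis the abstract describes — relating where subjects look on a painting to regions that the AR application augments — is commonly expressed as aggregating raw gaze samples into dwell time per area of interest (AOI). The sketch below is purely illustrative: the AOI names, coordinates, and sampling rate are assumptions, not values from the study.

```python
# Illustrative sketch (not the authors' implementation): summing the time
# gaze samples spend inside rectangular areas of interest on a painting.

SAMPLE_RATE_HZ = 60  # assumed eye-tracker sampling rate

# Hypothetical AOIs as (name, x_min, y_min, x_max, y_max),
# in normalized image coordinates (0..1).
AOIS = [
    ("face", 0.40, 0.10, 0.60, 0.35),
    ("background", 0.00, 0.00, 1.00, 0.10),
]

def dwell_times(gaze_samples, aois=AOIS, rate_hz=SAMPLE_RATE_HZ):
    """Sum the time spent in each AOI from a list of (x, y) gaze samples."""
    dt = 1.0 / rate_hz  # duration represented by one sample
    totals = {name: 0.0 for name, *_ in aois}
    for x, y in gaze_samples:
        for name, x0, y0, x1, y1 in aois:
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += dt
                break  # count each sample toward one AOI only
    return totals

# 2 s of samples on the face, 1 s on the background
samples = [(0.5, 0.2)] * 120 + [(0.5, 0.05)] * 60
print(dwell_times(samples))
```

Comparing such per-AOI dwell times between a baseline viewing condition and an AR-assisted one is one plausible way to quantify how digital contents influence visual behaviour, as the abstract proposes.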


Keywords: Augmented Reality · Museums · Eye-tracking · Behavioural analysis · Mobile



We thank our colleagues Ramona Quattrini and Paolo Clini from DICEA Department who provided expertise and materials that greatly assisted the research. We also thank Jacopo Di Girolamo for help in eye-tracking data collection.



Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Department of Materials, Environmental Sciences and Urban Planning, Università Politecnica delle Marche, Ancona, Italy
  2. Department of Information Engineering, Università Politecnica delle Marche, Ancona, Italy
  3. Department of Agricultural, Food and Environmental Sciences, Università Politecnica delle Marche, Ancona, Italy
