Does task-irrelevant music affect gaze allocation during real-world scene viewing?

Abstract

Gaze control emerges from a dynamic integration of visual and auditory information, with sound providing important cues for how a viewer should behave. Some past research suggests that music, even if entirely irrelevant to the current task demands, may also sway the timing and frequency of fixations. The current work sought to further assess this idea and to investigate whether task-irrelevant music could also impact how gaze is spatially allocated. In preparation for a later memory test, participants studied pictures of urban scenes in silence or while simultaneously listening to one of two types of music. Eye tracking was recorded, and nine gaze behaviors were measured to characterize the temporal and spatial aspects of gaze control. Findings showed that while these gaze behaviors changed over the course of viewing, music had no impact. Participants in the music conditions, however, did show better memory performance than those who studied in silence. These findings are discussed in the context of theories of multimodal gaze control.

Notes

  1. Following the experiment, participants in the music conditions were asked to self-report the extent of their formal training in classical music; 60% reported no experience, 22% reported less than 3 years, and 18% reported more than 3 years.

  2. Measures of frequency and duration are not orthogonal. For example, within a given amount of time, an increase in fixation durations should correlate with a decrease in the number of fixations observed; the first sketch following these notes makes this trade-off concrete. The measures do, however, characterize gaze in different ways, and consistency across non-orthogonal measures enables stronger conclusions to be drawn from the data.

  3. Although we discuss one model of salience in depth, we also considered a second model to ensure our conclusions were not biased by specific modeling choices. The Graph-Based Visual Saliency model (GBVS; Harel et al., 2007) computes, and then combines, multiscale feature maps (i.e., intensity, color, and orientation) via linear center-surround computations that mimic human visual receptive fields; a toy illustration of the center-surround idea appears after these notes. GBVS also promotes higher saliency values in the center of the image to account for observers’ tendency to allocate fixations toward the center of static images. The results obtained with both AWS and GBVS were entirely consistent.

  4. Collapsing the classical and modern-classical conditions into a single group and conducting a 2 (no music vs. music) × 15 (time intervals) ANOVA yielded consistent results; a sketch of this mixed design appears after these notes.

  5. In follow-up questionnaires, participants in the no-music condition rated the task as more “boring” (4.29 on a 7-point Likert scale, where higher ratings indicate greater boredom) than participants in the classical music (3.41) and modern-classical music (3.65) conditions, F(2, 66) = 3.14, p = .050; a sketch of this one-way test appears after these notes. While this difference was not associated with gaze, it is possible that differing levels of experienced boredom underlie the differences in memory performance.

References

  1. Auer, K., Vitouch, O., Koreimann, S., Pesjak, G., Leitner, G., & Hitz, M. (2012). When music drives vision: Influences of film music on viewers’ eye movements. Proceedings of the 12th International Conference on Music Perception and Cognition, 73–76.

  2. Batten, J., & Smith, T. J. (2018). Looking at sound: Sound design and the audiovisual influences on gaze. In T. Dwyer, C. Perkins, S. Redmond, & J. Sita (Eds.), Seeing into screens: Eye tracking and the moving image. Bloomsbury.

  3. Becker, R. A., Chambers, J. M., & Wilks, A. R. (1988). The new S language. Wadsworth & Brooks/Cole.

  4. Berto, R., Massaccesi, S., & Pasini, M. (2008). Do eye movements measured across high and low fascination photographs differ? Addressing Kaplan's fascination hypothesis. Journal of Environmental Psychology, 28, 185–191.

  5. Borji, A., & Itti, L. (2013). State-of-the-art in visual attention modeling. IEEE Transactions on Pattern Analysis and Machine Intelligence, 35, 185–207.

  6. Cassidy, G. G., & MacDonald, R. A. R. (2010). The effects of music on time perception and performance of a driving game. Scandinavian Journal of Psychology, 51, 455–464.

  7. Cohen, A. J. (2001). Music as a source of emotion in film. In P. Juslin & J. Sloboda (Eds.), Music and emotion: Theory and research (pp. 249–272). Oxford University Press.

  8. Cohen, A. J. (2014). Film music from the perspective of cognitive science. In D. Neumeyer (Ed.), The Oxford handbook of film music studies. Oxford University Press.

  9. Colonius, H., & Arndt, P. (2001). A two-stage model for visual-auditory interaction in saccadic latencies. Perception & Psychophysics, 63, 126–147.

  10. Corneil, B. D., Van Wanrooij, M., Munoz, D. P., & Van Opstal, A. J. (2002). Auditory-visual interactions subserving goal-directed saccades in a complex scene. Journal of Neurophysiology, 88, 438–454.

  11. Coutrot, A., Guyader, N., Ionescu, G., & Caplier, A. (2012). Influence of soundtrack on eye movements during video exploration. Journal of Eye Movement Research, 5(2), 1–10.

  12. Day, R.-F., Lin, C.-H., Huang, W.-H., & Chuang, S.-H. (2009). Effects of music tempo and task difficulty on multi-attribute decision-making: An eye-tracking approach. Computers in Human Behavior, 25, 120–143.

  13. Di Stasi, L. L., Marchitto, M., Antolí, A., & Cañas, J. J. (2013). Saccade peak velocity as an alternative index of operator attention: Short review. European Review of Applied Psychology, 63, 335–343.

  14. Einhäuser, W., & Nuthmann, A. (2016). Salient in space, salient in time: Fixation probability predicts fixation duration during natural scene viewing. Journal of Vision, 16(11), 13.

  15. Faber, M., Krasich, K., Bixler, R. E., Brockmole, J. R., & D’Mello, S. K. (2020). The eye–mind wandering link: Identifying gaze indices of mind wandering across tasks. Journal of Experimental Psychology: Human Perception and Performance, 46, 1201.

  16. Fachner, J. (2011). Time is the key - music and altered states of consciousness. In E. Cardenas, M. Winkelmann, C. Tart, & S. Krippner (Eds.), Altering consciousness: A multidisciplinary perspective. Vol. 1: History, culture and the humanities (pp. 355–376). Praeger.

  17. Franěk, M., Šefara, D., Petružálek, J., Cabal, J., & Myška, K. (2018). Differences in eye movements while viewing images with various levels of restorativeness. Journal of Environmental Psychology, 57, 10–16.

  18. Franěk, M., Šefara, D., Petružálek, J., Mlejnek, R., & van Noorden, L. (2018). Eye movements in scene perception while listening to slow and fast music. Journal of Eye Movement Research, 11(2), 8.

  19. Frens, M. A., Van Opstal, A. J., & Van der Willigen, R. F. (1995). Spatial and temporal factors determine auditory-visual interactions in human saccadic eye movements. Perception & Psychophysics, 57, 802–816.

  20. Garcia-Diaz, A., Leboran, V., Fdez-Vidal, X. R., & Pardo, X. M. (2012). On the relationship between optical variability, visual saliency, and eye fixations: A computational approach. Journal of Vision, 12(6), 17.

  21. Harel, J., Koch, C., & Perona, P. (2007). Graph-based visual saliency. In B. Schölkopf, J. Platt, & T. Hofmann (Eds.), Advances in neural information processing systems (pp. 545–552). MIT Press. https://doi.org/10.7551/mitpress/7503.001.0001

  22. Hayes, T. R., & Henderson, J. M. (2019). Scene semantics involuntarily guide attention during visual search. Psychonomic Bulletin & Review, 26, 1683–1689.

  23. Henderson, J. M., & Hayes, T. R. (2017). Meaning-based guidance of attention in scenes as revealed by meaning maps. Nature Human Behaviour, 1, 743.

  24. Henderson, J. M., & Hayes, T. R. (2018). Meaning guides attention in real-world scene images: Evidence from eye movements and meaning maps. Journal of Vision, 18(6), 10.

  25. Herbert, R. (2011). Music listening: Absorption, dissociation and trancing. Ashgate.

  26. Herbert, R. (2012). Musical and non-musical involvement in daily life: The case of absorption. Musicae Scientiae, 16, 41–66.

  27. Hollingworth, A. (2006). Scene and position specificity in visual memory for objects. Journal of Experimental Psychology: Learning, Memory, and Cognition, 32, 58–69.

  28. Hou, X., Harel, J., & Koch, C. (2012). Image signature: Highlighting sparse salient regions. IEEE Transactions on Pattern Analysis and Machine Intelligence, 34, 194–201.

  29. Jeffreys, H. (1961). Theory of probability (3rd ed.). Oxford University Press.

  30. Judd, T., Durand, F., & Torralba, A. (2012). A benchmark of computational models of saliency to predict human fixations (Tech. Rep. No. MIT-CSAIL-TR-2012-001). MIT Computer Science and Artificial Intelligence Laboratory.

  31. Kam, J. W., & Handy, T. C. (2013). The neurocognitive consequences of the wandering mind: A mechanistic account of sensory-motor decoupling. Frontiers in Psychology, 4, 725.

  32. Kämpfe, J., Sedlmeier, P., & Renkewitz, F. (2011). The impact of background music on adult listeners: A meta-analysis. Psychology of Music, 39, 424–448.

  33. Koelsch, S., Bashevkin, T., Kristensen, J., Tvedt, J., & Jentschke, S. (2019). Heroic music stimulates empowering thoughts during mind-wandering. Scientific Reports, 9, Article 10317.

  34. Krasich, K., McManus, R., Hutt, S., Faber, M., D'Mello, S. K., & Brockmole, J. R. (2018). Gaze-based signatures of mind wandering during real-world scene processing. Journal of Experimental Psychology: General, 147, 1111–1124.

  35. Land, M. F., & Lee, D. N. (1994). Where we look when we steer. Nature, 369 (6483), 742–744.

  36. Mera, M., & Stumpf, S. (2014). Eye-tracking film music. Music and the Moving Image, 7, 3–23.

  37. Morey, R. D., Romeijn, J. W., & Rouder, J. N. (2016). The philosophy of Bayes factors and the quantification of statistical evidence. Journal of Mathematical Psychology, 72, 6–18.

  38. Murray, S., Krasich, K., Schooler, J. W., & Seli, P. (2020). What’s in a task? Complications in the study of the task-unrelated-thought variety of mind wandering. Perspectives on Psychological Science, 15(3), 572–588.

  39. Najemnik, J., & Geisler, W. S. (2005). Optimal eye movement strategies in visual search. Nature, 434, 387–391.

  40. Najemnik, J., & Geisler, W. S. (2009). Simple summation rule for optimal fixation selection in visual search. Vision Research, 49, 1286–1294.

  41. Neider, M. B., & Zelinsky, G. J. (2006). Scene context guides eye movements during visual search. Vision Research, 46, 614–621.

  42. Olshausen, B. A., & Field, D. J. (2005). How close are we to understanding V1? Neural Computation, 17, 1665–1699.

  43. Parkhurst, D. J., Law, K., & Niebur, E. (2002). Modeling the role of salience in the allocation of overt visual attention. Vision Research, 42, 107–123.

  44. Quigley, C., Onat, S., Harding, S., Cooke, M., & König, P. (2008). Audio-visual integration during overt visual attention. Journal of Eye Movement Research, 1(4), 1–17.

  45. Raftery, A. E. (1995). Bayesian model selection in social research. In P. V. Marsden (Ed.), Sociological Methodology 1995 (pp. 111–196). Blackwell.

  46. Ranti, C., Jones, W., Klin, A., & Shultz, S. (2020). Blink rate patterns provide a reliable measure of individual engagement with scene content. Scientific Reports, 10, Article 8267.

  47. Reichle, E. D., Reineberg, A. E., & Schooler, J. W. (2010). Eye movements during mindless reading. Psychological Science, 21, 1300–1310.

  48. Riche, N., Mancas, M., Duvinage, M., Mibulumukini, M., Gosselin, B., & Dutoit, T. (2013). RARE2012: A multi-scale rarity-based saliency detection with its comparative statistical analysis. Signal Processing: Image Communication, 28, 642–658.

  49. Schäfer, T., & Fachner, J. (2015). Listening to music reduces eye movements. Attention, Perception, & Psychophysics, 77, 551–559.

  50. Schooler, J. W., Smallwood, J., Christoff, K., Handy, T. C., Reichle, E. D., & Sayette, M. A. (2011). Meta-awareness, perceptual decoupling and the wandering mind. Trends in Cognitive Sciences, 15, 319–326.

  51. Shinoda, H., Hayhoe, M. M., & Shrivastava, A. (2001). What controls attention in natural environments? Vision Research, 41, 3535–3545.

  52. Shultz, S., Klin, A., & Jones, W. (2011). Inhibition of eye blinking reveals subjective perceptions of stimulus salience. Proceedings of the National Academy of Sciences, 108, 21270–21275.

  53. Smallwood, J. (2013). Distinguishing how from why the mind wanders: A process–occurrence framework for self-generated mental activity. Psychological Bulletin, 139, 519.

  54. Smallwood, J., & Schooler, J. W. (2006). The restless mind. Psychological Bulletin, 132, 946.

  55. Smilek, D., Carriere, J. S., & Cheyne, J. A. (2010). Out of mind, out of sight: Eye blinking as indicator and embodiment of mind wandering. Psychological Science, 21, 786–789.

  56. Taruffi, L., Pehrs, C., Skouras, S., & Koelsch, S. (2017). Effects of sad and happy music on mind-wandering and the default mode network. Scientific Reports, 7, Article 14396.

  57. Tatler, B. W., Brockmole, J. R., & Carpenter, R. H. S. (2017). LATEST: A model of saccadic decisions in space and time. Psychological Review, 124, 267–300.

  58. Torralba, A., Oliva, A., Castelhano, M. S., & Henderson, J. M. (2006). Contextual guidance of eye movements and attention in real-world scenes: The role of global features in object search. Psychological Review, 113, 766.

  59. Ünal, A. B., Steg, L., & Epstude, K. (2012). The influence of music on mental effort and driving performance. Accident Analysis & Prevention, 48, 271–278.

  60. Uzzaman, S., & Joordens, S. (2011). The eyes know what you are thinking: Eye movements as an objective measure of mind wandering. Consciousness and Cognition, 20, 1882–1886.

  61. Valtchanov, D., & Ellard, C. G. (2015). Cognitive and affective responses to natural scenes: Effects of low-level visual properties on preference, cognitive load, and eye-movements. Journal of Environmental Psychology, 43, 184–195.

  62. Võ, M. L. H., & Henderson, J. M. (2009). Does gravity matter? Effects of semantic and syntactic inconsistencies on the allocation of attention during scene perception. Journal of Vision, 9(3), 24.

  63. Wallengren, A.-K., & Strukelj, A. (2015). Film music and visual attention: A pilot experiment using eye-tracking. Music and the Moving Image, 8, 69–80.

  64. Walther, D., & Koch, C. (2006). Modeling attention to salient proto-objects. Neural Networks, 19, 1395–1407.

  65. Wetzels, R., Matzke, D., Lee, M. D., Rouder, J. N., Iverson, G. J., & Wagenmakers, E. J. (2011). Statistical evidence in experimental psychology: An empirical comparison using 855 t tests. Perspectives on Psychological Science, 6, 291–298.

  66. Yarbus, A. L. (1967). Eye movements during perception of complex objects. Plenum.

  67. Zhang, H., Anderson, N. C., & Miller, K. F. (2020). Refixation patterns of mind-wandering during real-world scene perception. Journal of Experimental Psychology: Human Perception and Performance, 47(1), 36–52. https://doi.org/10.1037/xhp0000877

Author information

Corresponding author

Correspondence to Kristina Krasich.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix 1

Fig. 6. The complete stimulus set. White boxes indicate the vignettes taken from each of the full scene photographs.

Appendix 2

All analyses were conducted with the software package JASP (Version 0.14.1) using default priors. The models are ordered by their predictive performance relative to the best model.
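
As a reading aid for the tables that follow, the BFM and BF01 columns can be recovered from the P(M) and P(M|data) columns: BFM is the change from prior to posterior model odds, and BF01 compares the best model against each competitor. A minimal sketch, using the two leading rows of the fixation count table (results match to within rounding):

```python
# Recompute BFM and BF01 from the prior and posterior model probabilities
# reported for "Fixation count" below (JASP computes these internally).
p_prior = 0.20  # P(M): five candidate models with equal prior probability
p_post = {"Time": 0.784, "Time + Condition": 0.216}

prior_odds = p_prior / (1 - p_prior)
best = max(p_post.values())
for model, p in p_post.items():
    bf_m = (p / (1 - p)) / prior_odds  # prior-to-posterior change in odds
    bf_01 = best / p                   # best model relative to this model
    print(f"{model}: BFM = {bf_m:.2f}, BF01 = {bf_01:.2f}")
```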

Fixation count
Models | P(M) | P(M|data) | BFM | BF01 | error %
Time | .20 | .784 | 14.54 | 1.00 |
Time + Condition | .20 | .216 | 1.10 | 3.64 | 8.14
Time + Condition + Time × Condition | .20 | 2.75e−4 | .00 | 2847.14 | 48.51
Null model (incl. subject) | .20 | 3.30e−140 | 1.32e−139 | 2.38e+139 | 0.75
Condition | .20 | 6.25e−141 | 2.50e−140 | 1.25e+140 | 3.58
Note. All models include subject; P(M) = prior model probability; P(M|data) = posterior model probability; BFM = change from prior to posterior model odds.
Fixation duration
Models | P(M) | P(M|data) | BFM | BF01 | error %
Time | .20 | .87 | 26.20 | 1.00 |
Time + Condition | .20 | .132 | .61 | 6.56 | 6.25
Time + Condition + Time × Condition | .20 | 1.64e−4 | 6.56e−4 | 5289.12 | 38.03
Null model (incl. subject) | .20 | 5.91e−46 | 2.37e−45 | 1.47e+45 | 0.34
Condition | .20 | 8.49e−47 | 3.39e−46 | 1.02e+46 | 3.47
Note. All models include subject; P(M) = prior model probability; P(M|data) = posterior model probability; BFM = change from prior to posterior model odds.
Saccade count
Models | P(M) | P(M|data) | BFM | BF01 | error %
Time | .20 | 0.78 | 13.881 | 1.000 |
Time + Condition | .20 | 0.22 | 1.146 | 3.485 | 7.334
Time + Condition + Time × Condition | .20 | 9.636e−4 | 0.004 | 805.622 | 41.868
Null model (incl. subject) | .20 | 2.361e−94 | 9.443e−94 | 3.288e+93 | 0.266
Condition | .20 | 5.157e−95 | 2.063e−94 | 1.505e+94 | 3.498
Note. All models include subject; P(M) = prior model probability; P(M|data) = posterior model probability; BFM = change from prior to posterior model odds.
Saccade duration
Models | P(M) | P(M|data) | BFM | BF01 | error %
Time | .20 | .781 | 14.29 | 1.00 |
Time + Condition | .20 | .218 | 1.11 | 3.59 | 8.95
Time + Condition + Time × Condition | .20 | 8.96e−4 | .00 | 871.89 | 48.94
Null model (incl. subject) | .20 | 1.11e−6 | 4.45e−6 | 702396.67 | .20
Condition | .20 | 3.06e−7 | 1.22e−6 | 2.56e+6 | 3.65
Note. All models include subject; P(M) = prior model probability; P(M|data) = posterior model probability; BFM = change from prior to posterior model odds.
Saccade amplitude
Models | P(M) | P(M|data) | BFM | BF01 | error %
Time | .20 | .60 | 5.93 | 1.00 |
Time + Condition | .20 | .40 | 2.63 | 1.51 | 4.57
Time + Condition + Time × Condition | .20 | .01 | .03 | 93.24 | 1.64
Null model (incl. subject) | .20 | 4.48e−46 | 1.79e−45 | 1.33e+45 | .30
Condition | .20 | 2.58e−46 | 1.03e−45 | 2.32e+45 | .97
Note. All models include subject; P(M) = prior model probability; P(M|data) = posterior model probability; BFM = change from prior to posterior model odds.
Fixation dispersion
Models | P(M) | P(M|data) | BFM | BF01 | error %
Null model (incl. subject) | .20 | .96 | 100.54 | 1.00 |
Condition | .20 | .04 | .16 | 25.45 | 1.14
Time | .20 | 4.55e−4 | .00 | 2114.83 | .26
Time + Condition | .20 | 1.77e−5 | 7.09e−5 | 54290.23 | .79
Time + Condition + Time × Condition | .20 | 5.72e−8 | 2.29e−7 | 1.68e+7 | .90
Note. All models include subject; P(M) = prior model probability; P(M|data) = posterior model probability; BFM = change from prior to posterior model odds.
Visual salience
Models | P(M) | P(M|data) | BFM | BF01 | error %
Time | .20 | .87 | 27.52 | 1.00 |
Time + Condition | .20 | .12 | 0.56 | 7.08 | .55
Time + Condition + Time × Condition | .20 | .00 | 0.01 | 249.65 | .59
Null model (incl. subject) | .20 | 9.79e−17 | 3.91e−16 | 8.92e+15 | .23
Condition | .20 | 1.27e−17 | 5.08e−17 | 6.87e+16 | .49
Note. All models include subject; P(M) = prior model probability; P(M|data) = posterior model probability; BFM = change from prior to posterior model odds.
Semantic interest
Models | P(M) | P(M|data) | BFM | BF01 | error %
Time | .20 | .89 | 24.07 | 1.00 |
Time + Condition | .20 | .14 | .66 | 6.02 | 6.67
Time + Condition + Time × Condition | .20 | 1.10e−4 | 4.40e−4 | 7793.12 | 49.80
Null model (incl. subject) | .20 | 4.44e−77 | 1.78e−76 | 1.9e+76 | .28
Condition | .20 | 5.19e−78 | 2.08e−77 | 1.65e+77 | 1.03
Note. All models include subject; P(M) = prior model probability; P(M|data) = posterior model probability; BFM = change from prior to posterior model odds.
Semantic meaning
Models | P(M) | P(M|data) | BFM | BF01 | error %
Time | .20 | .87 | 27.29 | 1.00 |
Time + Condition | .20 | .13 | 0.57 | 6.99 | 10.29
Time + Condition + Time × Condition | .20 | .00 | .01 | 539.69 | 0.24
Null model (incl. subject) | .20 | .00 | .01 | 730.05 | 31.86
Condition | .20 | 2.45e−4 | 9.81e−4 | 3555.88 | 3.50
Note. All models include subject; P(M) = prior model probability; P(M|data) = posterior model probability; BFM = change from prior to posterior model odds.
Blink rate
Models | P(M) | P(M|data) | BFM | BF01 | error %
Time | .20 | .72 | 10.29 | 1.00 |
Time + Condition | .20 | .28 | 1.55 | 2.58 | 7.48
Time + Condition + Time × Condition | .20 | 3.14e−4 | .00 | 2296.95 | 41.97
Null model (incl. subject) | .20 | 7.81e−10 | 3.12e−9 | 9.23e+8 | .19
Condition | .20 | 2.70e−10 | 1.08e−9 | 2.67e+9 | 3.60
Note. All models include subject; P(M) = prior model probability; P(M|data) = posterior model probability; BFM = change from prior to posterior model odds.
Saccade peak velocity
Models | P(M) | P(M|data) | BFM | BF01 | error %
Time | .20 | .526 | 4.44 | 1.00 |
Time + Condition | .20 | .450 | 3.28 | 1.17 | 6.75
Time + Condition + Time × Condition | .20 | .023 | .10 | 22.50 | 7.29
Null model (incl. subject) | .20 | 1.58e−39 | 6.31e−39 | 3.34e+38 | 7.25
Condition | .20 | 1.40e−39 | 5.59e−39 | 3.77e+38 | 6.75
Note. All models include subject; P(M) = prior model probability; P(M|data) = posterior model probability; BFM = change from prior to posterior model odds.

Cite this article

Krasich, K., Kim, J., Huffman, G., et al. Does task-irrelevant music affect gaze allocation during real-world scene viewing? Psychonomic Bulletin & Review (2021). https://doi.org/10.3758/s13423-021-01947-4

Keywords

  • Gaze control
  • Music
  • Visual salience
  • Semantic informativeness
  • Eye tracking