Multimedia Tools and Applications, Volume 74, Issue 22, pp 10161–10176

The influence of color during continuity cuts in edited movies: an eye-tracking study

Abstract

Professionally edited videos contain frequent editorial cuts – that is, abrupt image changes from one frame to the next. The impact of these cuts on human eye movements is currently not well understood. In the present eye-tracking study, we experimentally gauged how strongly color and visual continuity contribute to viewers’ eye movements following cinematic cuts. In our experiment, viewers were presented with two edited action sports movies on the same screen but were instructed to watch, and keep their gaze on, only one of them. Crucially, the movies were frequently interrupted and continued after a short break either at the same or at switched locations. Hence, viewers needed to rapidly recognize the continuation of the relevant movie and re-orient their gaze toward it. Properties of saccadic eye movements following each interruption probed recognition of the relevant movie after a cut. Two key findings were that (i) memory co-determines attention after cuts in edited videos, resulting in faster re-orientation toward scene continuations when visual continuity across the interruption is high than when it is low, and (ii) color contributes to the guidance of attention after cuts, but its benefit rests largely on enhanced discrimination of relevant from irrelevant visual information rather than on memory. Results are discussed with regard to previous research on eye movements in movies and on recognition processes, and possible future directions of research are outlined.

Keywords

Edited videos · Continuity · Color · Attention · Eye movements · Memory

Notes

Acknowledgments

The authors thank two anonymous reviewers for their excellent and helpful feedback on a previous version of this manuscript, as well as Blerim Zeqiri and Stefan Kandioller for assistance with data collection. This research was funded by a grant from the Wiener Wissenschafts-, Forschungs- und Technologiefonds (WWTF, Vienna Science and Technology Fund), no. CS 11–009 to Ulrich Ansorge, Shelley Buchinger, and Otmar Scherzer.

Compliance with Ethical Standards

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest. All research protocols complied with the Declaration of Helsinki and APA ethical standards.


Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  1. Cognitive Science Research Platform, University of Vienna, Vienna, Austria
  2. Faculty of Psychology, University of Vienna, Vienna, Austria