Psychonomic Bulletin & Review, Volume 22, Issue 1, pp 163–169

Melody recognition revisited: influence of melodic Gestalt on the encoding of relational pitch information

  • Yune-Sang Lee
  • Petr Janata
  • Carlton Frost
  • Zachary Martinez
  • Richard Granger
Brief Report

Abstract

Melody recognition entails the encoding of pitch intervals between successive notes. While it has been shown that a whole melodic sequence is encoded better than the sum of its constituent intervals, the underlying reasons have remained unclear. Here, we compared listeners’ accuracy in encoding the relative pitch distance between the two notes of an interval (for example, C, E) with their accuracy under three modifications of that interval: (1) doubling the duration of each note (C–, E–), (2) repeating each note (C, C, E, E), and (3) adding a preceding note (G, C, E). Repeating each note (2) or adding an extra note (3) improved encoding of the relative pitch distance when the melodic sequences were transposed to other keys, whereas lengthening the duration (1) did not improve encoding relative to the standard two-note interval sequences. Crucially, encoding accuracy was higher with the four-note sequences than with the long two-note sequences, even though the sensory (pitch) information was held constant. We interpret these results as showing that re-forming the Gestalt of a two-note interval into a three- or four-note “melody” yields more accurate encoding of relational pitch information, owing to the richer structural context in which the interval is embedded.

Keywords

Music · Melody · Gestalt · Interval · Pitch · Recognition

Copyright information

© Psychonomic Society, Inc. 2014

Authors and Affiliations

  • Yune-Sang Lee (1, 3)
  • Petr Janata (2)
  • Carlton Frost (1)
  • Zachary Martinez (1)
  • Richard Granger (1)

  1. Department of Psychological and Brain Sciences, Dartmouth College, Hanover, USA
  2. Center for Mind and Brain, University of California, Davis, USA
  3. Department of Neurology, University of Pennsylvania, Philadelphia, USA
