Neural substrates of processing syntax and semantics in music
Growing evidence indicates that syntax and semantics are basic aspects of music. After the onset of a chord, initial music-syntactic processing can be observed at about 150–400 ms, and processing of musical semantics at about 300–500 ms. Processing of musical syntax activates the inferior frontolateral cortex, the ventrolateral premotor cortex and presumably the anterior part of the superior temporal gyrus. These brain structures have been implicated in the sequencing of complex auditory information, the identification of structural relationships, and serial prediction. Processing of musical semantics appears to activate posterior temporal regions. The processes and brain structures involved in the perception of syntax and semantics in music overlap considerably with those involved in language perception, underscoring the intimate links between music and language in the human brain.
- Avanzini G, Faienza C, Minciacchi D, Lopez L, Majno M (2003) The Neurosciences and Music. Annals of the New York Academy of Sciences 999. This volume provides another extensive overview of the different fields investigated within the neurocognition of music.
- Heinke W, Kenntner R, Gunter TC, Sammler D, Olthoff D, Koelsch S (2004) Differential effects of increasing propofol sedation on frontal and temporal cortices: an ERP study. Anesthesiology 100, 617–625. Using ERPs and music, the authors investigated the effects of a sedative drug (propofol) on auditory processing. They found that, even at lower doses, propofol affects the processing of ‘higher’ cognitive functions located in multimodal cortex, whereas functions located in primary auditory cortical areas remain unaffected.
- Knoesche TR, Neuhaus C, Haueisen J, Alter K, Maess B, Witte OW, Friederici AD (2005) The perception of phrase structure in music. Human Brain Mapping (in press).
- Koelsch S, Kasper E, Sammler D, Schulze K, Gunter TC, Friederici AD (2004) Music, language, and meaning: brain signatures of semantic processing. Nat Neurosci 7, 302–307. Using ERPs and a priming paradigm, the authors show that music can activate representations of meaningful concepts. This is the first study to show that music can transfer meaningful information, and that this transfer relies on the neurophysiological processes engaged in the processing of semantics in language.
- Koelsch S, Fritz T, Schulze K, Alsop D, Schlaug G (2005) Adults and children processing music: an fMRI study. NeuroImage (in press).
- Meyer LB (1956) Emotion and Meaning in Music. University of Chicago Press, Chicago.
- Pinker S (1997) How the Mind Works. Norton.
- Zatorre RJ, Peretz I (2003) The Cognitive Neuroscience of Music. Oxford University Press. This book provides an extensive overview of the different fields investigated within the neurocognition of music.