Machine Intelligence: The Neuroscience of Chordal Semantics and Its Association with Emotion Constructs and Social Demographics

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9384)


We present an extension to knowledge discovery in Music Information Retrieval (MIR) databases, relating emotional indices to (i) scalar music theory and (ii) correlated behavioral demographics. Certain social demographics exhibit consistent patterns in how their members dress, behave in society, solve problems, and deal with anger and other emotional states. It is also well documented that particular musical scales evoke particular emotional states and carry personalities of their own. This paper extends prior work in Knowledge Discovery in Databases (KDD) and Rough Set Theory that mathematically links musical scale theory to emotions. We now extend the paradigm by associating the emotions evoked by music with social demographics, and by examining how strongly these relationships affect, if at all, how one may dress, behave in society, solve problems, and deal with anger and other emotional states.
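The Rough Set Theory machinery the abstract refers to can be illustrated on a toy decision table. The sketch below is not the paper's implementation; the attributes (scale, tempo), the emotion labels, and all data rows are invented for illustration. It shows the core rough-set step: partitioning objects into indiscernibility classes over the condition attributes, then computing the lower approximation (objects whose class certainly implies an emotion) and the upper approximation (objects whose class possibly implies it).

```python
# Minimal rough-set sketch over a toy music/emotion decision table.
# All attribute names and rows are illustrative, not from the paper.

from collections import defaultdict

# Toy decision table: (scale, tempo) are condition attributes,
# the third field is the decision attribute (reported emotion).
table = [
    ("minor", "slow", "sad"),
    ("minor", "slow", "sad"),
    ("minor", "fast", "tense"),
    ("major", "fast", "happy"),
    ("major", "fast", "sad"),    # conflicts with the previous row
    ("major", "slow", "calm"),
]

def approximations(rows, concept):
    """Lower/upper approximation of the set of rows labelled `concept`,
    using (scale, tempo) to define the indiscernibility relation."""
    blocks = defaultdict(list)            # indiscernibility classes
    for i, (scale, tempo, _emotion) in enumerate(rows):
        blocks[(scale, tempo)].append(i)
    target = {i for i, r in enumerate(rows) if r[2] == concept}
    lower, upper = set(), set()
    for ids in blocks.values():
        ids = set(ids)
        if ids <= target:                 # class lies entirely in the concept
            lower |= ids
        if ids & target:                  # class overlaps the concept
            upper |= ids
    return lower, upper

lower, upper = approximations(table, "sad")
print(sorted(lower))  # rows where "sad" follows with certainty
print(sorted(upper))  # rows where "sad" is merely possible
```

The gap between the two approximations (the boundary region) is where the conditions underdetermine the emotion; rules extracted from the lower approximation hold with certainty, while boundary objects yield only possible rules.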


Chordal semantics · Computer science · Emotion · KDD · Machine intelligence · Machine learning · MIR · Neuro-endocrinology · Neuroscience · Rough set theory · Social demographics



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Department of Computer Science, University of Colorado, Colorado Springs, USA
