The establishment of eye contact between a musician and his or her listeners, and its influence on musical perception, has been neglected by music performance research. Two experiments tested the hypothesis that increased eye contact between musician and audience leads the audience to appreciate the performed music more. In the first experiment a musician played and sang three pieces; in the second he played three pieces without vocal support. In both experiments, directing the musician's gaze toward the audience enhanced the qualities of the musical experience.
This aspect needs clarification. Both Sloboda and Juslin (2001) and Scherer and Zentner (2001) observed that in experimental research the distinction between emotion as experienced by the listener, emotion as it is believed to be experienced by the performer, and emotion as a quality attributed to the music itself is rather fluid, and that empirical findings vary according to the specific instructions given to participants, since instructions can stress differently who or what is the referent of the emotion (see also Grewe et al., 2007). We therefore paid particular attention to devising the instructions. They made clear that the first question (“Quanto ti è piaciuto il brano?”—literally, “How much did you like the piece of music?”) asked for a judgment of liking of the piece; that the second series of questions (“Quanto il brano musicale ha suscitato in te ciascuna delle seguenti emozioni?”—literally, “To what extent did the piece of music elicit each of the following emotions in you?”) concerned the emotions actually experienced by the respondent while listening; and that the third series (“Quanto ritieni sia stato espressivo l’esecutore?” and “Quanto ritieni sia stato comunicativo l’esecutore?”—literally, “How expressive do you think the performer was?” and “How communicative do you think the performer was?”) concerned the performer.
It might be argued that aggregating the scores recorded for each piece of music, so as to obtain a global score for each variable (liking, joy, and so on) in each condition (no-turn, 3-turn, and 6-turn) regardless of the piece, undermines the assumption that the three pieces were interchangeable; the emotions each piece conveys might, after all, be quite different, a concern that applies especially to scores referring to emotional reactions. Four considerations justify our aggregation of the scores recorded for each variable across pieces. First, in both experiments the pattern of trends and the statistical results were the same for the aggregated scores and for each separate piece of music. Second, we analyzed the data using piece of music as a between-subjects variable, applying a 3 (piece of music: between-subjects) × 3 (head-turning condition: within-subjects) ANOVA to all dependent variables: in no case did a significant interaction effect emerge. We thus had evidence that the effects produced by head-turning were similar across pieces, despite possible differences in the subjective reactions elicited by the different pieces employed in each experiment. Third, as a further check that the three pieces used in the same experiment elicited similar subjective reactions, we computed the mean evaluation of each variable (collapsing scores across the three conditions)—each emotion as well as liking, expressiveness, and communicativeness—for each piece, and ranked these means from highest to lowest. In both experiments the resulting rankings were the same for all three pieces.
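As an illustration, the aggregation step can be sketched in a few lines of Python. The ratings and piece names below are invented for illustration only, not data from the study:

```python
# Sketch of the score-aggregation step (hypothetical ratings, not the
# study's data): for one variable (e.g. liking), a participant's ratings
# of the three pieces are averaged within each head-turning condition.

def aggregate(ratings):
    """ratings: {condition: {piece: score}} -> {condition: mean score}."""
    return {
        condition: sum(by_piece.values()) / len(by_piece)
        for condition, by_piece in ratings.items()
    }

# One participant's invented liking ratings on a 1-10 scale.
participant = {
    "no-turn": {"piece A": 5, "piece B": 6, "piece C": 4},
    "3-turn":  {"piece A": 6, "piece B": 7, "piece C": 6},
    "6-turn":  {"piece A": 8, "piece B": 8, "piece C": 7},
}

print({c: round(v, 2) for c, v in aggregate(participant).items()})
# {'no-turn': 5.0, '3-turn': 6.33, '6-turn': 7.67}
```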
Statistical support for this conclusion came from correlations between the ranks computed for each pair of musical pieces (Spearman’s ρ coefficients: Experiment 1, Just the Way You Are–Take Me Home = .95, Just the Way You Are–What a Feeling = .91, Take Me Home–What a Feeling = .88; Experiment 2, Georgia on My Mind–Misty = .86, Georgia on My Mind–Moonlight Serenade = .71, Misty–Moonlight Serenade = .91; all coefficients statistically significant). Finally, the absolute mean scores for each variable were very similar across the three pieces: in Experiment 1, the difference between the score recorded for one piece and the corresponding score for the other two exceeded 1.0 in only 2 out of 16 cases, and the same was true in Experiment 2. We can therefore maintain that each piece of music, though it had its own emotional nuances, elicited approximately the same emotional activation. This allows us to treat the aggregated scores as valid overall measures of each kind of subjective reaction to music performance investigated in this study.
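A rank correlation of this kind can be computed, for any pair of rankings without ties, from the classic Spearman formula. The rankings below are hypothetical, not the study's actual ranks:

```python
# Spearman's rho between two rankings of the same items (no ties),
# from the formula rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)).
# The example rankings are invented for illustration.

def spearman_rho(rank_a, rank_b):
    assert len(rank_a) == len(rank_b)
    n = len(rank_a)
    d_squared = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - 6 * d_squared / (n * (n ** 2 - 1))

# Hypothetical ranks of eight evaluation variables for two pieces,
# ordered from highest to lowest mean score (1 = highest).
piece_1 = [1, 2, 3, 4, 5, 6, 7, 8]
piece_2 = [1, 3, 2, 4, 5, 7, 6, 8]

print(spearman_rho(piece_1, piece_2))  # ~.95, near-identical orderings
```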
The logic underlying our model testing followed both theoretical and statistical criteria. The first model we tested was derived from our theoretical assumptions (namely, Freeman’s (2004) perspective). Since this model obtained good fit indices only in the 6-headturn condition, we tested alternative models for the no-headturn and 3-headturn conditions in which the link between expressiveness and ratings of joy, which turned out to be weak, was eliminated. To rule out the possibility that other models might fit better than our final ones, we also tested all of the alternatives. In parallel, we tested the differences between correlations among all dependent variables across the three conditions (no-headturn, 3-headturn, 6-headturn). Specifically, we computed Pearson–Filon statistics to test differences between correlations involving one pair of variables and correlations involving a second, related but nonoverlapping pair, in the no-headturn versus 3-headturn conditions as well as in the 3-headturn versus 6-headturn conditions. Differences between correlations were not statistically significant in the no- versus 3-headturn comparisons, whereas in most cases they were significant in the 3- versus 6-headturn comparisons. This led us to keep the same models—those with the best fit indices—in the no- and 3-headturn conditions and to use a different model for the 6-headturn condition. The same procedure was followed in Experiment 2. In conclusion, the models we report are those that obtained the best fit indices.
For the RMSEA it has been suggested that values < .05 indicate good fit, values from .05 to .08 acceptable fit, values from .08 to .10 marginal fit, and values > .10 poor fit (Browne & Cudeck, 1992). For the GFI and CFI, values > .95 indicate good fit and values > .90 acceptable fit (Medsker et al., 1994).
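The RMSEA cutoffs above can be expressed as a small helper. How a value falling exactly on a boundary (.05, .08, .10) is classified below is our own convention, since the source gives ranges rather than boundary rules:

```python
# Classify an RMSEA value using the Browne & Cudeck (1992) cutoffs.
# Boundary handling (exact .05/.08/.10) is an assumption of this sketch.

def rmsea_fit(value):
    if value < 0.05:
        return "good"
    if value <= 0.08:
        return "acceptable"
    if value <= 0.10:
        return "marginal"
    return "poor"

for rmsea in (0.03, 0.06, 0.09, 0.12):
    print(rmsea, rmsea_fit(rmsea))
# 0.03 good / 0.06 acceptable / 0.09 marginal / 0.12 poor
```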
Since all the pieces of music we employed elicited joy as the predominant emotion, a caveat is in order: we cannot be sure that the same results would be obtained with a genre of music in which another emotion (for instance, sadness) is prevalent.
A 3 (gaze condition: no-turn versus 3-turn versus 6-turn) × 2 (type of music: vocal versus instrumental) mixed ANOVA tested possible differences between Experiments 1 and 2 in liking, joy, expressiveness, and communicativeness scores. In all cases the main effects of gaze were statistically significant (respectively, F = 11.07, p < .001; F = 16.32, p < .001; F = 67.86, p < .001; F = 108.21, p < .001). The type of music produced significant differences in expressiveness and communicativeness scores (respectively, F = 17.66, p < .001; F = 14.24, p < .001) but not in liking and joy scores (respectively, F = 1.19, p = .278; F = 3.65, p = .058); the significant differences may depend on intrinsic features of the pieces, since a comparison of Tables 1 and 2 shows that they emerged even in the no-turn condition. No significant interaction effects occurred in any case (respectively, F = 0.49, p = .611; F = 0.16, p = .855; F = 0.61, p = .545; F = 0.36, p = .700), leading us to maintain that gaze affected the intensity of the subjective reactions to the pieces in a similar way in vocal and instrumental-only music.
Anderson, N. R. (1991). Decision making in the graduate selection interview: An experimental investigation. Human Relations, 44(4), 403–417.
Arbuckle, J. L., & Wothke, W. (1999). Amos 4.0 user’s guide. Chicago, IL: Small Waters Corporation.
Argyle, M. (1983). The psychology of interpersonal behaviour (4th ed.). Harmondsworth: Penguin.
Bayliss, A. P., di Pellegrino, G., & Tipper, S. P. (2004). Orienting of attention via observed eye gaze is head-centred. Cognition, 94, B1–B10.
Browne, M. W., & Cudeck, R. (1992). Alternative ways of assessing model fit. Sociological Methods and Research, 21, 230–258.
Child, I. (1962). Personal preferences as an expression of aesthetic sensitivity. Journal of Personality, 30, 496–512.
Dahl, S., & Friberg, A. (2003). What can the body movements reveal about a musician’s emotional intention? In Proceedings of the Stockholm Music Acoustics Conference, Stockholm, pp 599–602.
Dahlhaus, C., & Eggebrecht, H. H. (1985). Was ist Musik? Wilhelmshaven: Heinrichshofen’s Verlag.
Davidson, J. W. (1994). What type of information is conveyed in the body movements of solo musician performers? Journal of Human Movement Studies, 6, 279–301.
Davidson, J. W. (1995). What does the visual information contained in music performances offer the observer? Some preliminary thoughts. In R. Steinberg (Ed.), Music and the mind machine: Psychophysiology and psychopathology of the sense of music (pp. 105–114). Heidelberg: Springer.
Davidson, J. W. (2001). The role of the body in the production and perception of solo vocal performance: A case study of Annie Lennox. Musicae Scientiae, 5(2), 235–256.
De Nora, T. (2000). Music in everyday life. Cambridge: Cambridge University Press.
Ellsworth, P. C., & Langer, E. J. (1976). Staring and approach: An interpretation of the stare as a nonspecific activator. Journal of Personality and Social Psychology, 33, 117–122.
Exline, R. V. (1971). Visual interaction: The glances of power and preference. In J. K. Cole (Ed.), Nebraska symposium on motivation (Vol. 19, pp. 162–205). Lincoln: University of Nebraska Press.
Freeman, N. H. (2004). Aesthetic judgment and reasoning. In E. Eisner & M. Day (Eds.), Handbook of research and policy in art education (pp. 815–828). Mahwah, NJ: Erlbaum.
Frischen, A., Bayliss, A. P., & Tipper, S. P. (2007). Gaze cueing of attention: Visual attention, social cognition, and individual differences. Psychological Bulletin, 133, 694–724.
Fry, R., & Smith, G. F. (1975). The effect of feedback and eye contact on performance of a digit-coding task. Journal of Social Psychology, 96, 145–146.
Fullwood, C. (2007). The effect of mediation on impression formation: A comparison of face-to-face and video-mediated conditions. Applied Ergonomics, 38, 267–273.
Fullwood, C., & Doherty-Sneddon, G. (2006). Effect of gazing at the camera during a video link on recall. Applied Ergonomics, 37(2), 167–175.
George, N., Driver, J., & Dolan, R. J. (2001). Seen gaze-direction modulates fusiform activity and its coupling with other brain areas during face processing. NeuroImage, 13, 1102–1112.
Grewe, O., Nagel, F., Kopiez, R., & Altenmüller, E. (2007). Emotions over time: Synchronicity and development of subjective, physiological, and facial affective reactions to music. Emotion, 7(4), 774–788.
Hanna, J. E., & Brennan, S. E. (2007). Speakers’ eye gaze disambiguates referring expressions early during face-to-face conversation. Journal of Memory and Language, 57, 596–615.
Kelley, D. H., & Gorham, J. (1988). Effects of immediacy on recall of information. Communication Education, 37(3), 198–207.
Kleck, R., & Nuessle, W. (1968). Congruence between the indicative and communicative functions of eye contact in interpersonal relations. British Journal of Social and Clinical Psychology, 7, 241–246.
Kleinke, C. L. (1986). Gaze and eye contact: A research review. Psychological Bulletin, 100, 78–100.
Kurosawa, K., & Davidson, J. W. (2005). Nonverbal behaviours in popular music performance: A case study of The Corrs. Musicae Scientiae, 9, 111–133.
Langton, S. R. H., Watt, R. J., & Bruce, V. (2000). Do the eyes have it? Cues to the direction of social attention. Trends in Cognitive Sciences, 4, 50–59.
Larsen, R. J., & Shackelford, T. K. (1996). Gaze avoidance: Personality and social judgments of people who avoid direct face-to-face contact. Personality and Individual Differences, 21(6), 907–917.
MacCallum, R. C., & Austin, J. T. (2000). Applications of structural equation modelling in psychological research. Annual Review of Psychology, 51, 201–226.
Mason, M. F., Tatkow, E. P., & Macrae, C. N. (2005). The look of love: Gaze shifts and person perception. Psychological Science, 16(3), 236–239.
Medsker, G. J., Williams, L. J., & Holahan, P. J. (1994). A review of current practices for evaluating causal models in organizational behavior and human resources management research. Journal of Management, 20, 439–464.
Mehrabian, A., & Williams, M. (1969). Nonverbal concomitants of perceived and intended persuasiveness. Journal of Personality and Social Psychology, 13, 37–58.
Nichols, K., & Champness, B. (1971). Eye gaze and the GSR. Journal of Experimental Social Psychology, 7, 623–626.
Ohgushi, K. (2006). Interaction between auditory and visual information in conveyance of players’ intentions. Acoustic Science and Technology, 27, 336–339.
Ohgushi, K., & Hattori, M. (1996). Emotional communication in performance of vocal music. Interaction between auditory and visual information. In B. Pennycook & E. Costa-Giomi (Eds.), Proceedings of the Fourth International Conference on Music Perception and Cognition, Montreal, pp 269–274.
Ohgushi, K., & Sakuma, M. (1997). Conveyance of a player’s intentions in performances of percussion. Interaction between auditory and visual information. In Y. Ando & D. Noson (Eds.), Music and concert hall acoustics (pp. 397–400). New York: Academic Press.
Ryan, C., Wapnick, J., Lacaille, N., & Darrow, A.-A. (2006). The effects of various physical characteristics of high-level performers on adjudicators’ performance ratings. Psychology of Music, 34, 559–572.
Scherer, K. R., & Zentner, K. R. (2001). Emotional effects of music: Production rules. In J. A. Sloboda & P. N. Juslin (Eds.), Music and emotion (pp. 361–392). New York: Oxford University Press.
Sloboda, J. A., & Juslin, P. N. (2001). Psychological perspectives on music and emotion. In J. A. Sloboda & P. N. Juslin (Eds.), Music and emotion (pp. 71–104). New York: Oxford University Press.
Staneski, R. A., & Kleinke, C. L. (1978). Nondependence between verbal and nonverbal measures of attraction. Paper presented at the Meeting of the Western Psychological Association, San Francisco, April.
Thompson, W. F., Graham, P., & Russo, F. A. (2005). Seeing music performance: Visual influences on perception and experience. Semiotica, 1(4), 203–227.
Thompson, W. F., & Russo, F. A. (2004). Visual influences on the perception of emotion in music. In S. Lipscomb, R. Ashley, R. Gjerdingen, & P. Webster (Eds.), Proceedings of the Eighth International Conference for Music Perception and Cognition, Northwestern University, pp 198–199.
Thompson, W. F., & Russo, F. A. (2006). Facial expressions of pitch structure. In Proceedings of the 9th International Conference on Music Perception and Cognition, University of Bologna, pp 1141–1143.
Thompson, W. F., Russo, F. A., & Quinto, L. (2006). Preattentive integration of visual and auditory dimensions of music. In Proceedings of the 2nd International Conference on Music and Gesture, Royal Northern College of Music, Manchester, pp 217–221.
Tomasello, M., & Carpenter, M. (2007). Shared intentionality. Developmental Science, 10, 121–125.
Vines, B. W., Krumhansl, C. L., Wanderley, M. M., & Levitin, D. J. (2006). Cross-modal interactions in the perception of musical performance. Cognition, 101, 80–113.
Wapnick, J., Darrow, A.-A., Kovacs, J., & Dalrymple, L. (1997). Effects of physical attractiveness on evaluation of vocal performance. Journal of Research in Music Education, 45, 470–479.
Wapnick, J., Mazza, J., & Darrow, A.-A. (1998). Effects of performer attractiveness, stage behavior and dress on violin performance evaluation. Journal of Research in Music Education, 46, 510–521.
Wapnick, J., Mazza, J., & Darrow, A.-A. (2000). Effects of performer attractiveness, stage behavior and dress on children’s piano performances. Journal of Research in Music Education, 48, 323–336.
Antonietti, A., Cocomazzi, D. & Iannello, P. Looking at the Audience Improves Music Appreciation. J Nonverbal Behav 33, 89 (2009). https://doi.org/10.1007/s10919-008-0062-x
Keywords: Eye contact, Musical gesture, Music perception, Emotional response