
Non-determinism in the Uptake of Gestural Information

  • Original Paper
  • Journal of Nonverbal Behavior

Abstract

It is well established that gestures and speech form an integrated system of communication: gestures that match the meaning of the speech they accompany facilitate the listener’s discourse comprehension, whereas mismatching gestures, whose meaning contradicts the information conveyed by speech, impair comprehension. A less investigated issue is whether the uptake of gestural information is a deterministic process. In line with recent studies in the literature, we propose that the process may be modulated by certain factors. In particular, we investigate the role of unrelated gestures, whose meaning is irrelevant to the speech they accompany and could therefore be neglected. The results of four experiments led us to conclude that unrelated gestures are not processed, and that the uptake of gestural information is a non-deterministic process.


Notes

  1. There is general agreement in distinguishing three main categories of co-speech gestures, albeit under different labels (see, e.g., Bangerter 2004; Ekman and Friesen 1972; Hadar et al. 1998a; Kendon 1983; Krauss et al. 2000; McNeill et al. 1994): (1) Deictic gestures, which locate some aspects of the story being narrated in the physical space in front of the narrator and establish a joint focus of attention with the addressee; (2) Representational gestures, which pictorially represent concrete images of the speaker’s thoughts (iconic gestures) or abstract concepts (metaphoric gestures); (3) Batons, which mark the rhythm of speech and tend to have the same form regardless of the content.

  2. Although a correct recollection may take the form of a literal recollection, we consider as evidence of the construction of a discourse mental model only those correct recollections in which participants reformulate the content of the semantic units in their own words. Verbatim recall tends to occur in the absence of such a model (see also the results of Experiment 1).

References

  • Alibali, M. W., Flevares, L., & Goldin-Meadow, S. (1997). Assessing knowledge conveyed in gesture: Do teachers have the upper hand? Journal of Educational Psychology, 89, 183–193.

  • Bangerter, A. (2004). Using pointing and describing to achieve joint focus of attention in dialogue. Psychological Science, 15, 415–419.

  • Beattie, G., & Shovelton, H. (1999). Mapping the range of information contained in the iconic hand gestures that accompany spontaneous speech. Journal of Language and Social Psychology, 18, 438–462.

  • Berger, K. W., & Popelka, G. R. (1971). Extra-facial gestures in relation to speech reading. Journal of Communication Disorders, 3, 302–308.

  • Bonatti, L. (1998). Possibilities and real possibilities for a theory of reasoning. In Z. Pylyshyn (Ed.), Essays on representations (pp. 85–119). Norwood, NJ: Ablex Publishing Corporation.

  • Brown, J. (1958). Some tests of the decay theory of immediate memory. Quarterly Journal of Experimental Psychology, 10, 12–21.

  • Bucciarelli, M. (2007). How the construction of mental models improves learning. Mind & Society, 6, 67–89.

  • Bucciarelli, M., & Cutica, I. (2012). Mental models in improving learning. In N. M. Seel (Ed.), Encyclopedia of the sciences of learning, 2012, Part 13 (pp. 2211–2212). NY: Springer.

  • Cassell, J., McNeill, D., & McCullough, K.-E. (1999). Speech-gesture mismatches: Evidence for one underlying representation of linguistic and non-linguistic information. Pragmatics & Cognition, 7, 1–34.

  • Church, R. B., & Goldin-Meadow, S. (1986). The mismatch between gesture and speech as an index of transitional knowledge. Cognition, 23, 43–71.

  • Church, R. B., Kelly, S. D., & Lynch, K. (2000). Immediate memory for mismatched speech and representational gesture across development. Journal of Nonverbal Behavior, 24, 151–174.

  • Cutica, I., & Bucciarelli, M. (2008). The deep versus the shallow: Effects of co-speech gestures in learning from discourse. Cognitive Science, 32, 921–935.

  • Cutica, I., & Bucciarelli, M. (2012). Mental models in discourse processing. In N. M. Seel (Ed.), Encyclopedia of the sciences of learning, 2012, Part 13 (pp. 2213–2215). NY: Springer.

  • Ekman, P., & Friesen, W. V. (1972). Hand movements. Journal of Communication, 22, 353–374.

  • Feyereisen, P. (2006). Further investigation on the mnemonic effect of gestures: Their meaning matters. European Journal of Cognitive Psychology, 18, 185–205.

  • Galati, A., & Samuel, A. G. (2011). The role of speech-gesture congruency and delay in remembering action events. Language & Cognitive Processes, 26, 406–436.

  • Garnham, A., & Oakhill, J. (1992). Discourse processing and text representation from a “mental models” perspective. Language & Cognitive Processes, 7, 193–204.

  • Garnham, A., Oakhill, J., & Cain, K. (1998). Selective retention of information about the superficial form of text: Ellipses with antecedents in main and subordinate clauses. The Quarterly Journal of Experimental Psychology: Section A, 51, 19–39.

  • Glenberg, A. M., Kruley, P., & Langston, W. E. (1994). Analogical processes in comprehension: Simulation of a mental model. In M. A. Gernsbacher (Ed.), Handbook of psycholinguistics (pp. 609–640). San Diego, CA: Academic Press.

  • Goldin-Meadow, S. (1999). The role of gesture in communication and thinking. Trends in Cognitive Sciences, 3, 419–429.

  • Goldin-Meadow, S., & Alibali, M. W. (2013). Gesture’s role in speaking, learning, and creating language. Annual Review of Psychology, 64, 257–283.

  • Goldin-Meadow, S., Kim, S., & Singer, M. (1999). What the teacher’s hands tell the student’s mind about math. Journal of Educational Psychology, 91, 720–730.

  • Goldin-Meadow, S., & Sandhofer, C. M. (1999). Gestures convey substantive information about a child’s thoughts to ordinary listeners. Developmental Science, 2, 67–74.

  • Graesser, A. C., Millis, K. K., & Zwaan, R. A. (1997). Discourse comprehension. Annual Review of Psychology, 48, 163–189.

  • Graesser, A. C., Singer, M., & Trabasso, T. (1994). Constructing inferences during narrative text comprehension. Psychological Review, 101, 371–395.

  • Gullberg, M., & Kita, S. (2009). Attention to speech-accompanying gestures: Eye movements and information uptake. Journal of Nonverbal Behavior, 33, 251–277.

  • Habets, B., Kita, S., Shao, Z., Özyurek, A., & Hagoort, P. (2011). The role of synchrony and ambiguity in speech–gesture integration during comprehension. Journal of Cognitive Neuroscience, 23, 1845–1854.

  • Hadar, U., Burnstein, A., Krauss, R. M., & Soroker, N. (1998a). Ideational gestures and speech: A linguistic investigation. Language and Cognitive Processes, 13, 59–76.

  • Hadar, U., Wenkert-Olenik, D., Krauss, R. M., & Soroker, N. (1998b). Gesture and the processing of speech: Neuropsychological evidence. Brain and Language, 62, 107–126.

  • Hildebrandt, B., Moratz, R., Rickheit, G., & Sagerer, G. (1999). Cognitive modelling of vision and speech understanding. In G. Rickheit & C. Habel (Eds.), Mental models in discourse processing and reasoning (pp. 213–236). North Holland: Elsevier.

  • Holle, H., & Gunter, T. C. (2007). The role of iconic gestures in speech disambiguation: ERP evidence. Journal of Cognitive Neuroscience, 19, 1175–1192.

  • Hostetter, A. B. (2011). When do gestures communicate? A meta-analysis. Psychological Bulletin, 137, 297–315.

  • Iverson, J. M., & Goldin-Meadow, S. (2001). The resilience of gesture in talk: Gesture in blind speakers and listeners. Developmental Science, 4(4), 416–422.

  • Johnson-Laird, P. N. (1983). Mental models: Towards a cognitive science of language, and consciousness. Cambridge: Cambridge University Press.

  • Johnson-Laird, P. N. (2006). How we reason. New York: Oxford University Press.

  • Johnson-Laird, P. N., & Byrne, R. M. J. (1991). Deduction. London: Lawrence Erlbaum Associates Ltd.

  • Johnson-Laird, P. N., & Stevenson, R. (1970). Memory for syntax. Nature, 227, 412.

  • Kelly, S. D., Barr, D. J., Church, R. B., & Lynch, K. (1999). Offering a hand to pragmatic understanding: The role of speech and gesture in comprehension and memory. Journal of Memory and Language, 40, 577–592.

  • Kelly, S., & Church, R. B. (1998). A comparison between children’s and adults’ ability to detect conceptual information conveyed through representational gestures. Child Development, 69, 85–93.

  • Kelly, S., Kravitz, C., & Hopkins, M. (2004). Neural correlates of bimodal speech and gesture comprehension. Brain and Language, 89, 253–260.

  • Kelly, S. D., Ozyurek, A., & Maris, E. (2010). Two sides of the same coin: Speech and gesture mutually interact to enhance comprehension. Psychological Science, 21, 260–267.

  • Kelly, S., Ward, S., Creigh, P., & Bartolotti, J. (2007). An intentional stance modulates the integration of gesture and speech during comprehension. Brain and Language, 101, 222–233.

  • Kendon, A. (1983). Gesture and speech: How they interact. In J. A. Wiemann & R. P. Harrison (Eds.), Nonverbal interaction (pp. 13–45). Beverly Hills: Sage.

  • Kendon, A. (2004). Gesture: Visible action as utterance. Cambridge: Cambridge University Press.

  • Kintsch, W. (1998). Comprehension: A paradigm for cognition. Cambridge: Cambridge University Press.

  • Krauss, R. M., Chen, Y., & Gottesman, R. (2000). Lexical gestures and lexical access: A process model. In D. McNeill (Ed.), Language and gesture (pp. 261–283). Cambridge: Cambridge University Press.

  • Krauss, R. M., & Hadar, U. (1999). The role of speech-related arm/hand gestures in word retrieval. In R. Campbell & L. Messing (Eds.), Gesture, speech, and sign (pp. 93–116). Oxford: Oxford University Press.

  • McNeil, N. M., Alibali, M. V., & Evans, J. L. (2000). The role of gesture in children’s comprehension of spoken language: Now they need them, now they don’t. Journal of Nonverbal Behavior, 24, 131–150.

  • McNeill, D. (1985). So you think gestures are nonverbal? Psychological Review, 92, 350–371.

  • McNeill, D. (1992). Hand and mind. Chicago: University of Chicago Press.

  • McNeill, D. (2005). Gesture and thought. Chicago: University of Chicago Press.

  • McNeill, D., Cassell, J., & McCullough, K.-E. (1994). Communicative effects of speech mismatched gestures. Research on Language and Social Interaction, 27, 223–237.

  • Özyürek, A., Willems, R. M., Kita, S., & Hagoort, P. (2007). On-line integration of semantic information from speech and gesture: Insights from event-related brain potentials. Journal of Cognitive Neuroscience, 19, 605–616.

  • Peterson, L. R., & Peterson, M. J. (1959). Short-term retention of individual verbal items. Journal of Experimental Psychology, 58, 193–198.

  • Rickheit, G., & Sichelschmidt, L. (1999). Mental models: Some answers, some questions, some suggestions. In G. Rickheit & C. Habel (Eds.), Mental models in discourse processing and reasoning (pp. 9–40). North Holland: Elsevier.

  • Riseborough, M. G. (1981). Physiographic gestures as decoding facilitators: Three experiments exploring a neglected facet of communication. Journal of Nonverbal Behavior, 5, 172–183.

  • Singer, M. (1994). Discourse inference processes. In M. A. Gernsbacher (Ed.), Handbook of psycholinguistics (pp. 479–515). San Diego: Academic Press.

  • Skipper, J., Goldin-Meadow, S., Nusbaum, H., & Small, S. (2007). Speech-associated gestures, Broca’s area, and the human mirror system. Brain and Language, 101, 260–277.

  • So, W. C., Sim Chen-Hui, C., & Low Wei-Shan, J. (2012). Mnemonic effect of iconic gesture and beat gesture in adults and children: Is meaning in gesture important for memory recall? Language & Cognitive Processes, 27, 665–681.

  • Thompson, L. A., Driscoll, D., & Markson, L. (1998). Memory for visual-spoken language in children and adults. Journal of Nonverbal Behavior, 22, 167–187.

  • Thompson, L. A., & Massaro, D. W. (1986). Evaluation and integration of speech and pointing gestures during referential understanding. Journal of Experimental Child Psychology, 42, 144–168.

  • Thompson, L. A., & Massaro, D. W. (1994). Children’s integration of speech and pointing gestures in comprehension. Journal of Experimental Child Psychology, 57, 327–354.

  • van Dijk, T. A., & Kintsch, W. (1983). Strategies of discourse comprehension. New York: Academic Press.

  • Vendrame, M., Cutica, I., & Bucciarelli, M. (2010). “I see what you mean”: Oral deaf individuals benefit from speaker’s gesturing. European Journal of Cognitive Psychology, 22(4), 612–639.

  • Wang, L., & Chu, M. (2013). The role of beat gesture and pitch accent in semantic processing: An ERP study. Neuropsychologia, 51, 2847–2855.

  • Willems, R., & Hagoort, P. (2007). Neural evidence for the interplay between language, gesture, and action: A review. Brain and Language, 101, 278–289.

  • Wu, Y. C., & Coulson, S. (2005). Meaningful gestures: Electrophysiological indices of iconic gesture comprehension. Psychophysiology, 42, 654–667.

  • Wu, Y. C., & Coulson, S. (2010). Gestures modulate speech processing early in utterances. Cognitive Neuroscience & Neuropsychology, 21, 522–526.

  • Zwaan, R. A., Langston, M. C., & Graesser, A. C. (1995a). The construction of situation models in narrative comprehension: An event-indexing model. Psychological Science, 6, 292–297.

  • Zwaan, R. A., Magliano, J. P., & Graesser, A. C. (1995b). Dimensions of situation model construction in narrative comprehension. Journal of Experimental Psychology: Learning, Memory, and Cognition, 21, 386–397.

  • Zwaan, R. A., & Radvansky, G. A. (1998). Situation models in language comprehension and memory. Psychological Bulletin, 123, 162–185.

Acknowledgments

This research was supported by a grant from the Italian MIUR to the second author (PRIN project 2010–2011, prot. 2010RP5RNM).

Author information

Corresponding author

Correspondence to Ilaria Cutica.

Ethics declarations

Conflict of interest

The authors declare that they have no personal or institutional conflicts of interest.

Appendices

Appendix 1

Appendix 1A: The Funfair Discourse Used in Experiments 1 and 2 (Semantic Units are Separated by Slashes)

It was there, at the funfair, it was there that I found her,/ and it was at the funfair that I lost her./ It was a vast funfair./ A funfair with shooting-ranges and candy floss stalls/ and Japanese bagatelle tables, stalls with bottles of champagne/ and showmen’s booths and roundabouts./ And the roundabouts turned and creaked/ and the candy floss scented the air/ and the rifles shot./ I was shooting at the target./ I can shoot at the target very well and I am proud of it./ No, wait a moment, I am wrong!/ I did not meet her at the shooting-range./ I met her at the candy floss stall. Yes, it was at the candy floss stall that I found her./ The candy floss scented the air,/ and she was eating it/ and she blew on her candy/ and I was all covered with white powder./ She started laughing/ and I asked her: “What’s your name?”/ And she shouted to me: “I’ll tell you later”./

Appendix 1B: Examples of Co-speech Gestures Produced by the Animated Agent in Both the Gesture Condition and the Unrelated Condition (Experiments 1 and 2—Funfair Discourse)

The left and the central columns present the co-occurrence of speech and gestures in the gesture condition, whereas the right and the central columns present the co-occurrence of speech and gestures in the unrelated condition. Please note that the beginning of each condition is highlighted in bold.

Underscoring of semantic units indicates the duration of the gesture with the corresponding number.

Semantic units (gesture condition)

Computer-animated agent’s gestures

Semantic units (unrelated condition)

BEGINNING OF GESTURE CONDITION FICTION

  

It was there, at the funfair, it was there that I found her,

Hands still, resting on thighs

and it was at the funfair that I lost her

Hands still, resting on thighs

 

It was a vast funfair 1

(1) Raises both hands simultaneously to a position at the level of the navel, palms down

From here, the hands are raised to head height in a continuous movement, curving outwards as if describing a circle, with palms facing forward towards the hypothetical interlocutor. Both hands return to chest level

The candy floss scented the air,

and she was eating it, and she blew on her candy 1

A funfair with shooting-ranges 2 and candy floss stalls,3

(2) The right hand closes into a fist, and the left hand remains open with palm facing to the right; then both hands open and assume a position as if to point to the agent’s left; the left hand is slightly higher than the right; the left arm is almost fully straightened, while the right one is more bent. Then both hands move down to the thighs

(3) The right hand is rapidly raised to stomach level, outside the line of the body; the hand is half open as if to indicate a direction; it is then quickly moved back to rest on the thigh

And I was all covered with white powder 2

She started laughing 3

and Japanese bagatelle tables 4 stalls with bottles of champagne, 5

(4) The right hand is raised from the right leg and moves to one side of the body (further from the body than in the previous movement), almost to stomach height. The hand is half open, with the fingers towards a hypothetical interlocutor, as if to indicate a direction; the arm is partly bent

and I asked her: “What’s your name?” 4

(5) Just before the right hand comes to rest on the leg, the left hand is raised from the other leg and moves forward; the thumb and index finger are extended and pointing, while the other fingers are partly bent and the palm is almost hidden by the fingers. The hand then returns directly to the left leg

And she shouted to me, “I’ll 5 tell you later”

and showmen’s booths 6 and roundabouts 7

(6) Both hands are raised simultaneously from the legs. The right hand is open with the palm towards the right leg, away from the body, with the thumb pointed upwards; the left hand is open and moves away from the body, with the palm still facing the thigh, and the index finger and thumb straight. Both hands then move back to the thighs

(7) The left hand is raised immediately from the left leg and moved forward and to the right, showing the palm with fingers open, to chest height (the elbow is bent at 90°), and remains in that position for 2 s; the right hand is open and remains still at hip height, with the palm towards the thigh

BEGINNING OF UNRELATED CONDITION FICTION

(The actor is silent) 6

It was there, at the funfair 7

And the roundabouts turned and creaked, 8

(8) Both hands start to move from left to right, held forwards at chest height; the right hand continues the movement gradually downwards to hip level. Then both hands are moved back towards the legs but do not come to rest on them. The left hand, open and palm facing downwards, slides downwards and outwards from the left leg, with the arm extended. In a continuous movement, the hand moves back upwards in a circular fashion and comes back to rest on the left thigh

it was there 8 that I found her,

and the candy floss scented the air, 9

(9) Both hands are moved away from the legs up to the mouth area, in a cupped position 2–3 cm apart with the palms towards the mouth and fingers slightly bent, moving rapidly backwards and forwards for around 2 s

and it was at the funfair that I lost her 9

and the rifles shot 10

(10) Then the open right hand moves to the chest area, while the left hand moves down to hip level. The left hand, open and palm upwards, is raised in a circular movement, and then from the hip area it is moved up to the chest, near to the right hand

It was a vast funfair 10

I was shooting at the target 11

(11) Both hands are at chest height, with palms facing each other; both hands are moved towards the left. The left hand moves away from the body, while the right remains at stomach level. The ring and little fingers of the left hand close, while the other digits are straight; at the same time, the middle, ring and little fingers of the right hand close, while the thumb and index finger remain straight and extended

A funfair 11 with shooting-ranges 12 and candy floss stalls, 13

I can shoot at the target very well 12 and I am proud of it 13

(12) From that position, the hands are raised momentarily to shoulder height, with index fingers pointing upwards, and then immediately lowered to their initial position. The right hand is open, with the palm towards the stomach and the arm bent; the left hand is at the front, with thumb and index finger straight, while the other fingers and the right arm are slightly bent

(13) The hands are brought up to cover the face for 2 s; the hands are open, with the palms towards the face

No, no, 14 wait a moment 15

(14) The hands remain in this position for 2 s and then slowly descend to hip level. When the hands reach stomach level, the right hand moves towards the left, while the left hand opens and moves forwards, with the fingers almost fully extended. Then the hands move directly towards the thighs, but do not actually come to rest on them

(15) The left hand is raised quickly up to the top of the chest, with the palm towards the right and fingers open. The right hand is raised slightly, up to hip level, with the palm still facing downwards. The fingers are semi-extended. Both hands move rhythmically at around hip height and are brought slightly outwards

and Japanese bagatelle tables 14 stalls with bottles of champagne, 15

I am wrong!

The hands stay still near the legs for 7 s

and showmen’s booths and roundabouts

Appendix 1C: Examples of Sentences Used for the Recognition Task of Experiment 1

Literal:

I can shoot at the target very well

Paraphrases:

I am really able to shoot at the target

Wrong:

I do not know how to shoot at the target

Literal:

I shot, the egg popped up. I turned aside, she wasn’t there any more

Paraphrases:

When I shot at the egg she disappeared

Wrong:

While I was going to shoot at the egg, I turned aside and she wasn’t there any more

Appendix 2

Appendix 2A: The Color Discourse Used in Experiments 3 and 4 (Semantic Units are Separated by Slashes)

It’s beyond dispute/ that colors carry strong expressive components./ Some attempts have been made to describe the specific expressive characters of the various colors/ and to draw some general conclusions from the symbolic use the different cultures have made of them./ There is a very widespread belief that the expression of colors is based on association./ Therefore, red should be considered exciting/ because it reminds us of the connotations of fire, blood and revolution./ Green evokes the restorative thought of nature,/ and blue is refreshing like water./ The theory of association, however, is not more interesting or prolific in this field than in others./ In addition, the effects of colors are too direct and spontaneous/ to be simply the result of an interpretation given through knowledge./ On the other hand, no hypothesis has been advanced so far on the kind of physiological process/ that could help to explain the influence of colors on the organism./ The need to discuss the form makes us feel on more solid ground, though,/ as we can compare the expression of specific patterns/ with that of more general properties/ such as spatial orientation, balance or the geometrical characteristics of the outlines.

Appendix 2B: Examples of Co-speech Gestures Produced by the Animated Agent in Both the Gesture Condition and the Unrelated Condition (Experiments 3 and 4—Color Discourse)

The left and the central columns present the co-occurrence of speech and gestures in the gesture condition, whereas the right and the central columns present the co-occurrence of speech and gestures in the unrelated condition. Please note that the beginning of each condition is highlighted in bold. Underscoring of semantic units indicates the duration of the gesture with the corresponding number.

NB: if the numbering of the gestures corresponding to the semantic units is not perfectly sequential, this is likely to be due to changes in the sentence structure in the translation from the Italian. We have therefore preferred to maintain the speech-gesture correspondence rather than the sequential gesture numbering.

Semantic units (gesture condition)

Computer-animated agent’s gestures

Semantic units (unrelated condition)

BEGINNING OF GESTURE CONDITION FICTION

  

It’s beyond dispute that colors carry strong expressive components 1

(1) The agent’s hands are resting on his legs at knee height. He raises both hands simultaneously to the level of the navel. From here, the hands open outwards with palms facing slightly forwards, towards the hypothetical interlocutor. The hands are then brought swiftly back to the thighs

There is a very widespread belief that the expression of colors is based on association 1

Some attempts have been made to describe the specific expressive characteristics of the various colors 2

(2) The left hand is raised from the left leg up to the mouth area, half open and with the palm towards the right. The hand moves down to chest level in a continuous movement and then assumes an up-down undulating motion. Initially, the fingers are all partly closed, then the index finger gradually extends almost completely; the palm of the hand faces towards the interlocutor (first at mouth level and then at chest height)

At the same time, the open right hand is raised from the right leg to a position at the level of the navel, with the palm facing to the left and fingers half extended

Therefore, red should be considered exciting, 2

and to draw some general conclusions from the symbolic use the different cultures have made of them. 3

(3) From the chest, both hands move down onto the legs and closer to each other (to about 2–3 cm apart), with palms facing each other, fingers extended and thumbs pointing upwards. The right hand is raised from the right leg up to chest level, with fingers open, while the left hand rises at the same time in line with the right hand, first with fingers half closed and then extended. The right hand is then brought back onto the right leg, with palm facing to the left and thumb raised, while the left hand moves in small circles, with fingers semi-extended. Finally, the left hand moves back onto the left leg, with the palm facing to the right and fingers extended

because it reminds us of the connotations of fire, blood and revolution 3

There is a very widespread 6 belief 4 that the expression of colors is based on association 5

(4) The left hand is raised from the left leg up to neck height, with the index finger pointing upwards and the other fingers half closed, whereas the right hand remains open at hip level, with the palm facing to the left

(5) The left hand moves down from the mouth to shoulder height, then in an up-down waving motion, first with the hand open and then with fingers half closed, with the palm facing downwards and to the right

(6) The right hand is raised from the right thigh up to the same height as the left hand; both palms are parallel to the thighs. The hands move forwards, and both palms turn towards the agent; the arms remain partly bent

Green 4 evokes the restorative thought of nature 5,

and blue 6

is refreshing like water 7

Therefore, red 7 should be considered exciting, 8

(7) Both hands are brought up to chest height, first closer together then further apart again

(8) With both hands open at chest level, the right hand moves closer to then further from the left hand

because it reminds us of the connotations of fire 8, blood 9 and revolution 10

(8) From the chest, both hands are moved to the shoulder area, with palms facing each other and fingers extended. They are then quickly lowered to waist level, with fingers closed. The fingers immediately reopen and the right hand is raised up to the right shoulder; the fingers open and the index finger points upwards above the shoulder, with the palm turned slight to the left. The left hand stops at chest height, facing forwards with the palm open

(9) The left hand moves up and down repeatedly, while the right hand remains at shoulder level

(10) Both hands are moved to chest level; the fingers are open and the hands extended forwards with palms facing the interlocutor. The index fingers appear to point upwards

The theory of association, however, 8 is not more interesting or prolific 9 in this field 10 than in others

Green 11 evokes the restorative thought of nature 12,

(11) The left hand is lowered slightly and turned, in an open position, so that the palm faces upward, with the thumb and index finger more extended than the other digits. It then turns again, so as to finish with the palm facing the right hand

(12) Both hands are turned towards the agent and slightly folded (i.e., with the fingers of the right hand covering those of the left). Both hands then move outwards, with a turn of the arms, until the right hand is outside the line of the body, while the left hand remains slightly closer to the body. The hands return to chest level

In addition, 11 the effects of colors are too direct and 12 spontaneous

and blue 13 is refreshing like water 14

(13) The right hand moves down from the chest onto the right thigh, while the left hand is raised to shoulder level

(14) The left hand is moved repeatedly up and down, moving the fingers at the same time

to be simply the result 13 of an interpretation given 14 through knowledge

The theory of association, however, is not more interesting 15 or prolific 16 in this field than in others

(15) The left hand moves down from the shoulder to the chest and the palm is turned towards the interlocutor, with fingers half closed. Simultaneously, the right hand is raised from the right leg up to the same height as the left hand, with the palm facing towards it

(16) Both hands close and are then moved from the chest to rest on the legs

BEGINNING OF UNRELATED CONDITION FICTION

It’s beyond dispute 15

that colors carry strong expressive components 16

In addition 17, the effects of colors are too direct and spontaneous 18

(17) The hands are raised to hip level, with palms facing each other and fingers open

(18) The right hand remains at hip level, open and with the palm facing left, and moves outwards away from the body. The left hand is raised to chest height, outside the line of the body, and is pushed quickly forwards, with fingers open, and then pulled back equally quickly towards the chest

(19) The right hand is raised from the right leg by about 5 cm, with the palm towards the leg, while the left hand remains at chest level

Some attempts have been made 17 to describe

the specific 18 expressive characteristics of the various colors 19

to be simply the result 19 of an interpretation given through knowledge 20

(20) The right hand moves to around stomach level, away from the body; the hand is open with the index finger almost fully extended, the other fingers half closed, and the palm facing left. It is then brought back to rest on the leg, closing almost into a fist. At the same time, the left hand also moves slightly towards the side of the body, then returns to its previous position and then moves towards the side again

and to draw some general conclusions from the symbolic use the different cultures have made 20 of them

Appendix 2C: Examples of Sentences Used for the Recognition Task of Experiment 3

Literal:

Green evokes the restorative thought of nature

Paraphrases:

The color evoking the restorative thought of nature is green

Wrong:

Green evokes the sense of tiredness of nature

Literal:

Blue is refreshing like water

Paraphrases:

Blue color gives a feeling of freshness like water

Wrong:

Blue color is fresh like sea


About this article

Cite this article

Cutica, I., Bucciarelli, M. Non-determinism in the Uptake of Gestural Information. J Nonverbal Behav 39, 289–315 (2015). https://doi.org/10.1007/s10919-015-0215-7
