Identifying the Function of Hand Gestures from Their Form in Political Speech

Chapter in: Accentuated Innovations in Cognitive Info-Communication

Part of the book series: Topics in Intelligent Engineering and Informatics (TIEI, volume 16)

Abstract

This paper addresses the automatic classification of the semiotic type of hand gestures produced by six English-speaking politicians in different contexts, based on annotations of gesture form and shape. It builds upon the work proposed in [24] and extends it to more data. Gestures contribute to the successful delivery of a message in face-to-face communication by reinforcing what is conveyed by speech or by adding new information to it. Gestures are multi-functional and can have different meanings depending on the context. Identifying the semiotic type of gestures is the first step towards their automatic interpretation. Moreover, exploiting the relation between gesture form and function described in the literature, on many types of data, contributes to the automatic generation of gestures, e.g. in infocommunication systems. In the present work, we annotated the semiotic types of the hand gestures produced by Boris Johnson in two question-and-answer sessions at the British House of Commons and added them to the annotations of the gestures of five other politicians in different contexts. We trained various classifiers to identify the semiotic type of the hand gestures. The F1 score obtained by the best-performing algorithm on the classification of four semiotic types is 0.68. This result confirms, on a larger data set, the results of the preceding pilot study, indicating that it is possible to identify the semiotic type of hand gestures from their coarse-grained form features in roughly two thirds of the cases.
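The kind of form-to-function classification the abstract describes can be sketched as follows. This is a minimal illustration only, not the chapter's actual pipeline: the form features, their values, the four semiotic labels, and the scikit-learn setup are all hypothetical placeholders standing in for the annotated attributes used in the study.

```python
# Illustrative sketch: predicting the semiotic type of a hand gesture from
# coarse-grained categorical form annotations. All feature names, values,
# and data below are hypothetical.
from sklearn.feature_extraction import DictVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline

# Each gesture is a dict of categorical form features; the label is its
# semiotic type (iconic, deictic, beat, metaphoric).
gestures = [
    {"handedness": "both", "shape": "open_palm", "trajectory": "up"},
    {"handedness": "right", "shape": "index_extended", "trajectory": "forward"},
    {"handedness": "both", "shape": "fist", "trajectory": "down"},
    {"handedness": "left", "shape": "open_palm", "trajectory": "sideways"},
]
labels = ["iconic", "deictic", "beat", "metaphoric"]

# DictVectorizer one-hot encodes the categorical features; the classifier
# then learns the mapping from form to semiotic type.
clf = make_pipeline(DictVectorizer(sparse=False),
                    RandomForestClassifier(n_estimators=100, random_state=0))
clf.fit(gestures, labels)

print(clf.predict([{"handedness": "right", "shape": "index_extended",
                    "trajectory": "forward"}])[0])
```

In practice the model would be trained on the full set of annotated gestures and evaluated with cross-validation, reporting per-class and averaged F1 scores.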

Notes

  1. The videos can be seen at https://www.youtube.com/watch?v=ZUjONj5l_Ko and https://www.youtube.com/watch?v=e23sgoDaUus, respectively.

  2. The annotations are available from the author of the article.

  3. Significance is measured with a paired corrected t-test, and the significance level is \(p<0.001\).
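The paired corrected t-test mentioned in the note can be sketched as below. This assumes the corrected resampled t-test of Nadeau and Bengio (2003), the variant implemented e.g. in WEKA; the per-split scores and the 90/10 train/test ratio are hypothetical values for illustration.

```python
# Sketch of the corrected resampled (paired) t-test for comparing two
# classifiers over k repeated train/test splits.
import math
from statistics import mean
from scipy import stats

def corrected_paired_ttest(scores_a, scores_b, n_train, n_test):
    """Return (t statistic, two-sided p value) for per-split score differences."""
    diffs = [a - b for a, b in zip(scores_a, scores_b)]
    k = len(diffs)
    d_bar = mean(diffs)
    var = sum((d - d_bar) ** 2 for d in diffs) / (k - 1)  # sample variance
    # Correction: the naive 1/k variance factor becomes 1/k + n_test/n_train,
    # accounting for the overlap of training sets across resampled splits.
    t = d_bar / math.sqrt(var * (1 / k + n_test / n_train))
    p = 2 * stats.t.sf(abs(t), k - 1)  # Student's t, k-1 degrees of freedom
    return t, p

# Hypothetical per-split F1 scores for two classifiers:
t, p = corrected_paired_ttest([0.70, 0.72, 0.68, 0.71, 0.69],
                              [0.60, 0.63, 0.59, 0.62, 0.61],
                              n_train=90, n_test=10)
print(f"t = {t:.2f}, p = {p:.5f}")
```

The correction makes the test more conservative than a plain paired t-test, since scores from overlapping training sets are not independent.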

References

  1. Alexanderson S, House D, Beskow J (2016) Automatic annotation of gestural units in spontaneous face-to-face interaction. In: Proceedings of the workshop on multimodal analyses enabling artificial agents in human-machine interaction, MA3HMI '16. ACM, New York, NY, USA, pp 15–19

  2. Allwood J, Cerrato L, Jokinen K, Navarretta C, Paggio P (2007) The MUMIN coding scheme for the annotation of feedback, turn management and sequencing: multimodal corpora for modelling human multimodal behaviour. Special issue of Int J Lang Resour Eval 41(3–4):273–287

  3. Anwar S, Sinha S, Vivek S, Ashank V (2019) Hand gesture recognition: a survey. In: Nath V, Mandal J (eds) Nanoelectronics, circuits and communication systems. Lecture notes in electrical engineering, vol 511. Springer, Singapore, pp 365–371

  4. Baranyi P (ed) (2018) Special issue on cognitive infocommunications. Acta Polytechnica Hungarica 15

  5. Baranyi P, Csapo A, Sallai G (2015) Cognitive infocommunications (CogInfoCom). Springer International Publishing

  6. Cheok M, Omar Z, Jaward M (2019) A review of hand gesture and sign language recognition techniques. Int J Mach Learn Cybern 10:131–153

  7. Ekman P, Friesen WV (1969) The repertoire of nonverbal behavior: categories, origins, usage, and coding. Semiotica 1(1):49–98

  8. Esposito A, Marinaro M (2007) What pauses can tell us about speech and gesture partnership. In: Fundamentals of verbal and nonverbal communication and the biometric issue. NATO publishing series, sub-series E: human and societal dynamics, vol 18. IOS Press, pp 45–57

  9. Gebre BG, Crasborn O, Wittenburg P, Drude S, Heskes T (2014) Unsupervised feature learning for visual sign language identification. In: Proceedings of the 52nd annual meeting of the Association for Computational Linguistics. ACL, pp 370–376

  10. Kendon A (1972) Some relationships between body motion and speech. In: Siegman A, Pope B (eds) Studies in dyadic communication. Pergamon Press, Elmsford, New York, pp 177–210

  11. Kendon A (2004) Gesture: visible action as utterance. Cambridge University Press

  12. Keskin C, Aran O, Akarun L (2011) Hand gesture analysis. In: Salah AA, Gevers T (eds) Computer analysis of human behavior. Springer, London, pp 125–149

  13. Kipp M (2004) Gesture generation by imitation—from human behavior to computer character animation. PhD thesis, Saarland University, Saarbrücken, Germany. Dissertation.com, Boca Raton, Florida

  14. Klempous R, Nikodem J, Baranyi P (eds) (2019) Cognitive infocommunications: theory and applications. Springer International Publishing, Cham

  15. Krauss R, Chen Y, Gottesman RF (2000) Lexical gestures and lexical access: a process model. In: McNeill D (ed) Language and gesture. Cambridge University Press, pp 261–283

  16. Li G, Tang H, Sun Y, Kong J, Jiang G, Jiang D, Tao B, Xu S, Liu H (2019) Hand gesture recognition based on convolution neural network. Cluster Comput 22:2719–2729

  17. Loehr DP (2004) Gesture and intonation. PhD thesis, Georgetown University

  18. Loehr DP (2007) Aspects of rhythm in gesture and speech. Gesture 7(2):179–214

  19. McNeill D (1992) Hand and mind: what gestures reveal about thought. University of Chicago Press, Chicago

  20. McNeill D (2005) Gesture and thought. University of Chicago Press

  21. Navarretta C (2017) Barack Obama's pauses and gestures in humorous speeches. In: Proceedings of the 4th European and 7th Nordic symposium on multimodal communication (MMSYM 2016), Copenhagen, 29–30 Sept 2016, no. 141 in Linköping University conference proceedings. Linköping University Electronic Press, pp 28–36

  22. Navarretta C (2017) Prediction of audience response from spoken sequences, speech pauses and co-speech gestures in humorous discourse by Barack Obama. In: 8th IEEE international conference on cognitive infocommunications (CogInfoCom), pp 327–332. https://doi.org/10.1109/CogInfoCom.2016.7804554

  23. Navarretta C (2018) The automatic annotation of the semiotic type of hand gestures in Obama's humorous speeches. In: Proceedings of the eleventh international conference on language resources and evaluation (LREC 2018). ELRA, Miyazaki, Japan, pp 1067–1072

  24. Navarretta C (2019) Form and function of hand gestures for interpretation and generation. In: Proceedings of the 10th IEEE international conference on cognitive infocommunications (CogInfoCom). IEEE, pp 215–220

  25. Navarretta C, Paggio P (2013) Multimodal behaviour and interlocutor identification in political debates. In: Poggi I et al (eds) Multimodal communication in political speech: shaping minds and social action, no 7688 in LNAI. Springer, Berlin Heidelberg, pp 84–98

  26. Nespoulous JL, Lecours A (1986) Gesture: nature and function. In: Nespoulous JL, Perron P, Lecours A (eds) The biological foundations of gestures: motor and semiotic aspects, chap 2. Lawrence Erlbaum Associates, Hillsdale, NJ, pp 49–62

  27. Peirce CS (1931) Collected papers of Charles Sanders Peirce, 1931–1958, vol 8. Harvard University Press, Cambridge, MA

  28. Poggi I, Caldognetto E (1997) Mani che parlano [Hands that speak]. Unipress, Padova

  29. Wundt WM (1973) The language of gestures. Mouton

Author information

Correspondence to Costanza Navarretta.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Navarretta, C. (2023). Identifying the Function of Hand Gestures from Their Form in Political Speech. In: Klempous, R., Nikodem, J., Baranyi, P.Z. (eds) Accentuated Innovations in Cognitive Info-Communication. Topics in Intelligent Engineering and Informatics, vol 16. Springer, Cham. https://doi.org/10.1007/978-3-031-10956-0_10
