Abstract
This paper addresses the automatic classification of the semiotic type of hand gestures produced by six English-speaking politicians in different contexts, starting from annotations of gesture form and shape. It builds upon the work proposed in [24] and extends it to more data. Gestures contribute to the successful delivery of a message in face-to-face communication by reinforcing what is conveyed by speech or adding new information to it. Gestures are multi-functional and can have different meanings depending on the context. Identifying the semiotic type of a gesture is the first step towards its automatic interpretation. Moreover, exploiting the relation between gesture form and function, described in the literature, on many types of data contributes to the automatic generation of gestures, e.g. in infocommunication systems. In the present work we annotated the semiotic types of the hand gestures produced by Boris Johnson in two question/answer sessions at the British House of Commons and added them to the annotations of the gestures of five other politicians in different contexts. We trained various classifiers to identify the semiotic type of the hand gestures. The F1 score obtained by the best-performing algorithm on the classification of four semiotic types is 0.68. This result confirms, on a larger data set, the results obtained in the preceding pilot study, indicating that it is possible to identify the semiotic type of hand gestures from their coarse-grained form features in nearly two thirds of the cases.
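The setup the abstract describes can be sketched in miniature. This is a purely illustrative example, not the study's method: the feature names (handedness, hand shape, trajectory), their values, the toy data, and the 1-nearest-neighbour classifier are all invented for the sketch. It only shows the general shape of the task: mapping coarse-grained form features to one of four semiotic types and scoring with a macro-averaged F1.

```python
# Hypothetical sketch: classify gestures into four semiotic types
# (McNeill 1992) from invented coarse-grained form features, then
# compute a macro-averaged F1 score by hand.

SEMIOTIC_TYPES = ["iconic", "metaphoric", "deictic", "emblem"]

# (handedness, hand shape, trajectory) -> semiotic type; toy data only
train = [
    (("both", "open", "arc"), "iconic"),
    (("right", "index", "line"), "deictic"),
    (("both", "open", "circle"), "metaphoric"),
    (("right", "fist", "up-down"), "emblem"),
]
test = [
    (("right", "index", "line"), "deictic"),
    (("both", "open", "arc"), "iconic"),
    (("right", "open", "line"), "deictic"),
]

def predict(features):
    """1-nearest-neighbour by number of matching form features."""
    best = max(train, key=lambda ex: sum(a == b for a, b in zip(ex[0], features)))
    return best[1]

def macro_f1(pairs):
    """Macro-averaged F1 over the semiotic types present in the pairs."""
    scores = []
    for t in SEMIOTIC_TYPES:
        tp = sum(1 for gold, pred in pairs if gold == t and pred == t)
        fp = sum(1 for gold, pred in pairs if gold != t and pred == t)
        fn = sum(1 for gold, pred in pairs if gold == t and pred != t)
        if tp == fp == fn == 0:  # type absent from gold and predictions
            continue
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        scores.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(scores) / len(scores)

pairs = [(gold, predict(feats)) for feats, gold in test]
print(f"macro F1: {macro_f1(pairs):.2f}")
```

The study itself trained various classifiers on richer annotations; the point here is only that the features are categorical form descriptions and the evaluation is per-type F1 averaged over four classes.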
Notes
- 1. The videos can be seen at https://www.youtube.com/watch?v=ZUjONj5l_Ko and https://www.youtube.com/watch?v=e23sgoDaUus, respectively.
- 2. The annotations are available from the author of the article.
- 3. Significance is measured with a corrected paired t-test; the significance level is \(p<0.001\).
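The note does not say which correction is applied; a common choice when comparing classifiers over repeated cross-validation runs is the corrected resampled t-test of Nadeau and Bengio, as implemented e.g. in Weka's experimenter. A minimal sketch under that assumption (the function name and the `test_frac` parameter are hypothetical):

```python
import math

def corrected_paired_ttest(diffs, test_frac=0.1):
    """Corrected resampled paired t-statistic (Nadeau & Bengio 2003).

    `diffs` are per-fold score differences between two classifiers;
    `test_frac` is n_test / n_train of the assumed resampling split.
    The correction replaces the usual 1/k variance factor with
    1/k + n_test/n_train to compensate for overlapping training sets.
    """
    k = len(diffs)
    mean = sum(diffs) / k
    var = sum((d - mean) ** 2 for d in diffs) / (k - 1)
    denom = math.sqrt((1 / k + test_frac) * var)
    return mean / denom
```

The resulting statistic is compared against a t-distribution with \(k-1\) degrees of freedom, so significance at \(p<0.001\) requires exceeding the corresponding critical value.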
References
Alexanderson S, House D, Beskow J (2016) Automatic annotation of gestural units in spontaneous face-to-face interaction. In: Proceedings of the workshop on multimodal analyses enabling artificial agents in human-machine interaction, MA3HMI ’16. ACM, New York, NY, USA, pp 15–19
Allwood J, Cerrato L, Jokinen K, Navarretta C, Paggio P (2007) The MUMIN coding scheme for the annotation of feedback, turn management and sequencing: multimodal corpora for modelling human multimodal behaviour. Special issue of Int J Lang Resour Eval 41(3–4):273–287
Anwar S, Sinha S, Vivek S, Ashank V (2019) Hand gesture recognition: a survey. In: Nath V, Mandal J (eds) Nanoelectronics, circuits and communication systems. Lecture notes in electrical engineering, vol 511, pp 365–371. Springer, Singapore
Baranyi P (ed) (2018) Special issue on cognitive infocommunications. Acta Polytechnica Hungarica. J Appl Sci 15
Baranyi P, Csapo A, Sallai G (2015) Cognitive infocommunications (CogInfoCom). Springer International Publishing
Cheok M, Omar Z, Jaward M (2019) A review of hand gesture and sign language recognition techniques. Int J Mach Learn Cyber 10:131–153
Ekman P, Friesen WV (1969) The repertoire of nonverbal behavior: categories, origins, usage, and coding. Semiotica 1(1):49–98
Esposito A, Marinaro M (2007) What pauses can tell us about speech and gesture partnership. In: Fundamentals of verbal and nonverbal communication and the biometric issue. NATO publishing series sub-series E: human and societal dynamics, vol 18. IOS Press, pp 45–57
Gebre BG, Crasborn O, Wittenburg P, Drude S, Heskes T (2014) Unsupervised feature learning for visual sign language identification. In: Proceedings of the 52nd annual meeting of the association for computational linguistics. ACL, pp 370–376
Kendon A (1972) Some relationships between body motion and speech. In: Seigman A, Pope B (eds) Studies in dyadic communication. Pergamon Press, Elmsford, New York, pp 177–210
Kendon A (2004) Gesture: visible action as utterance. Cambridge University Press
Keskin C, Aran O, Akarun L (2011) Hand gesture analysis. In: Salah AA, Gevers T (eds) Computer analysis of human behavior. Springer, London, pp 125–149
Kipp M (2004) Gesture generation by imitation—from human behavior to computer character animation. PhD thesis, Saarland University, Saarbruecken, Germany. Dissertation.com, Boca Raton, Florida
Klempous R, Nikodem J, Baranyi P (eds) (2019) Cognitive infocommunications, theory and applications. Springer International Publishing, Cham
Krauss R, Chen Y, Gottesman RF (2000) Lexical gestures and lexical access: a process model. In: McNeill D (ed) Language and gesture. Cambridge University Press, pp 261–283
Li G, Tang H, Sun Y, Kong J, Jiang G, Jiang D, Tao B, Xu S, Liu H (2019) Hand gesture recognition based on convolution neural network. Cluster Comput 22:2719–2729
Loehr DP (2004) Gesture and intonation. PhD thesis, Georgetown University
Loehr DP (2007) Aspects of rhythm in gesture and speech. Gesture 7(2):179–214
McNeill D (1992) Hand and mind: what gestures reveal about thought. University of Chicago Press, Chicago
McNeill D (2005) Gesture and thought. University of Chicago Press
Navarretta C (2017) Barack Obama’s pauses and gestures in humorous speeches. In: Proceedings of the 4th European and 7th Nordic symposium on multimodal communication (MMSYM 2016), Copenhagen, 29–30 Sept 2016, no. 141 in Linköping University conference proceedings. Linköping University Electronic Press, Linköpings Universitet, pp 28–36
Navarretta C (2017) Prediction of audience response from spoken sequences, speech pauses and co-speech gestures in humorous discourse by Barack Obama. In: Proceedings of the 8th IEEE international conference on cognitive infocommunications (CogInfoCom), pp 327–332. https://doi.org/10.1109/CogInfoCom.2016.7804554
Navarretta C (2018) The automatic annotation of the semiotic type of hand gestures in Obama's humorous speeches. In: Proceedings of the eleventh international conference on language resources and evaluation (LREC 2018). ELRA, Miyazaki, Japan, pp 1067–1072
Navarretta C (2019) Form and function of hand gestures for interpretation and generation. In: Proceedings of the 10th IEEE international conference on cognitive infocommunications: CogInfoCom. IEEE, pp 215–220
Navarretta C, Paggio P (2013) Multimodal behaviour and interlocutor identification in political debates. In: Poggi I, et al (ed) Multimodal communication in political speech shaping minds and social action, no 7688 in LNAI. Springer-Verlag Berlin Heidelberg, pp 84–98
Nespoulous JL, Lecours A (1986) Gesture: nature and function. In: Nespoulous JL, Perron P, Lecours A (eds) The biological foundations of gestures: motor and semiotic aspects, Chap. 2. Lawrence Erlbaum Associates, Hillsdale, NJ, pp 49–62
Peirce CS (1931) Collected papers of Charles Sanders Peirce, 1931–1958, vol 8. Harvard University Press, Cambridge, MA
Poggi I, Caldognetto E (1997) Mani che parlano. Unipress, Padova
Wundt WM (1873) The language of gestures. Mouton
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Navarretta, C. (2023). Identifying the Function of Hand Gestures from Their Form in Political Speech. In: Klempous, R., Nikodem, J., Baranyi, P.Z. (eds) Accentuated Innovations in Cognitive Info-Communication. Topics in Intelligent Engineering and Informatics, vol 16. Springer, Cham. https://doi.org/10.1007/978-3-031-10956-0_10
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-10955-3
Online ISBN: 978-3-031-10956-0
eBook Packages: Intelligent Technologies and Robotics (R0)