
Engagement Perception and Generation for Social Robots and Virtual Agents

Toward Robotic Socially Believable Behaving Systems - Volume I

Part of the book series: Intelligent Systems Reference Library ((ISRL,volume 105))

Abstract

Technology is the future, woven into every aspect of our lives, but how are we to interact with all this technology, and what happens when problems arise? Artificial agents, such as virtual characters and social robots, could offer a realistic solution to help facilitate interactions between humans and machines—if only these agents were better equipped and more informed to hold up their end of an interaction. People and machines can interact to do things together, but to get the most out of every interaction, the agent must be able to make reasonable judgements about the human's intent and goals for the interaction. We explore the concept of engagement from the different perspectives of the human and the agent. More specifically, we study how the agent perceives the engagement state of the other interactant, and how it generates its own representation of engaging behaviour. In this chapter, we discuss the different stages and components of engagement that have been suggested in the literature, from the applied perspective of a case study of engagement for social robotics, as well as in the context of another study focused on gaze-related engagement with virtual characters.
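To make the perception side concrete: gaze direction is a well-known cue to social attention, and a simple way for an agent to estimate engagement is to smooth, over time, how directly the user's gaze points at it. The sketch below is a minimal, hypothetical illustration of that idea only—it is not the model described in this chapter, and the class name, smoothing scheme, and parameters are all assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class EngagementEstimator:
    """Toy engagement estimator (hypothetical, not the chapter's model).

    Treats engagement as an exponentially smoothed score of how directly
    the user's gaze vector points at the agent.
    """
    alpha: float = 0.3   # smoothing factor: higher = reacts faster
    score: float = 0.0   # current engagement estimate in [0, 1]

    def update(self, gaze, agent_dir):
        """gaze, agent_dir: 2D direction vectors (x, z) in the user's frame."""
        dot = gaze[0] * agent_dir[0] + gaze[1] * agent_dir[1]
        norm = math.hypot(*gaze) * math.hypot(*agent_dir)
        # 1.0 when looking straight at the agent, 0.0 when looking away
        attention = max(0.0, dot / norm) if norm else 0.0
        self.score = (1 - self.alpha) * self.score + self.alpha * attention
        return self.score

est = EngagementEstimator()
for _ in range(20):  # user keeps looking straight at the agent
    s = est.update((0.0, 1.0), (0.0, 1.0))
print(round(s, 3))
```

A real system would, of course, fuse many more cues (posture, task state, physiology—compare the sensors listed in the notes below), but the smoothing step captures the intuition that engagement is a sustained state rather than a momentary glance.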


Notes

  1. NAO, http://www.aldebaran-robotics.com/.

  2. Microsoft Kinect, http://www.microsoft.com/en-us/kinectforwindows/.

  3. Affectiva Q Sensor, http://www.qsensortech.com/.


Acknowledgments

This work was partially supported by the European Commission (EC) and was funded by the EU FP7 ICT-317923 project EMOTE (EMbOdied-perceptive Tutors for Empathy-based learning) and the EU Horizon 2020 ICT-644204 project ProsocialLearn. The authors are solely responsible for the content of this publication. It does not represent the opinion of the EC, and the EC is not responsible for any use that might be made of data appearing therein.

Author information

Correspondence to Lee J. Corrigan.


Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Corrigan, L.J., Peters, C., Küster, D., Castellano, G. (2016). Engagement Perception and Generation for Social Robots and Virtual Agents. In: Esposito, A., Jain, L. (eds) Toward Robotic Socially Believable Behaving Systems - Volume I. Intelligent Systems Reference Library, vol 105. Springer, Cham. https://doi.org/10.1007/978-3-319-31056-5_4


  • DOI: https://doi.org/10.1007/978-3-319-31056-5_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-31055-8

  • Online ISBN: 978-3-319-31056-5

  • eBook Packages: Engineering (R0)
