International Journal of Social Robotics, Volume 5, Issue 4, pp 477–490

Assessing Interaction Dynamics in the Context of Robot Programming by Demonstration

  • Ana Lucia Pais
  • Brenna D. Argall
  • Aude G. Billard

Abstract

In this paper we focus on the peculiarities of human–robot interaction that arise during programming by demonstration. Understanding what makes the interaction rewarding and keeps the user engaged helps optimize the robot’s learning. Two user studies are presented. The first validates facially displayed expressions on the iCub robot. The best-recognized displays are then used in a second study, along with other ways of providing feedback while teaching a manipulation task to the robot. We determine the preferred and most effective way of providing feedback in relation to the robot’s tactile sensing, in order to improve the teaching interaction and keep users engaged throughout.

Keywords

Robot programming by demonstration · Robot facial displays · Emotion expression · Interaction dynamics · Incremental learning


Copyright information

© Springer Science+Business Media Dordrecht 2013

Authors and Affiliations

  • Ana Lucia Pais (1)
  • Brenna D. Argall (2)
  • Aude G. Billard (1)

  1. Learning Algorithms and Systems Laboratory, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland
  2. Departments of EECS and PMR, Northwestern University, Chicago, USA
