
Regulation and Entrainment in Human-Robot Interaction

  • Cynthia Breazeal
Conference paper
Part of the Lecture Notes in Control and Information Sciences book series (LNCIS, volume 271)

Abstract

Newly emerging robotics applications for domestic or entertainment purposes are slowly introducing autonomous robots into society at large. A critical capability of such robots is their ability to interact with humans, in particular with untrained users. This paper explores the hypothesis that people will intuitively interact with robots in a natural social manner provided the robot can perceive and interpret familiar human social cues and respond to them appropriately. Two experiments are presented in which naive human subjects interact with an anthropomorphic robot. Evidence for mutual regulation and entrainment of the interaction is presented, and the ways in which this benefits the interaction as a whole are discussed.

Keywords

Facial Expression · Humanoid Robot · High Pitch · Pitch Contour · Naive Subject



Copyright information

© Springer-Verlag Berlin Heidelberg 2001

Authors and Affiliations

  • Cynthia Breazeal 1
  1. MIT Artificial Intelligence Lab, Cambridge, USA
