Personal and Ubiquitous Computing, Volume 20, Issue 1, pp 51–63

Robotic experience companionship in music listening and video watching

  • Guy Hoffman
  • Shira Bauman
  • Keinan Vanunu
Original Article

Abstract

We propose the notion of robotic experience companionship (REC): a person’s sense of sharing an experience with a robot. Do a robot’s presence and response to a situation affect a human’s understanding of the situation and of the robot, even without direct human–robot interaction? We present the first experimental assessment of REC, studying people’s experience of entertainment media as they share it with a robot. Both studies use an autonomous custom-designed desktop robot capable of performing gestures synchronized to the media. Study I (\(n=67\)), examining music listening companionship, finds that the robot’s dance-like response to music causes participants to feel that the robot is co-listening with them, and increases their liking of songs. The robot’s response also increases its perceived human character traits. We find REC to be moderated by music listening habits, such that social listeners were more affected by the robot’s response. Study II (\(n=91\)), examining video watching companionship, supports these findings, demonstrating that social video viewers enjoy the experience more with the robot present, while habitually solitary viewers do not. Also in line with Study I, the robot’s response to the video clip causes people to attribute more positive human character traits to the robot. These findings have implications for robots as companions for digital media consumption, and suggest REC-based design guidelines for other shared experiences with personal robots.


Keywords: Human–robot interaction · Social robotics · Music listening · Video watching · Digital companions · Social referencing



Acknowledgments

The authors would like to thank Shoshana Krug for assistance in running the study, as well as Avital Mentovich and Oren Zuckerman for valuable comments on an earlier draft of this paper.



Copyright information

© Springer-Verlag London 2016

Authors and Affiliations

  1. Media Innovation Lab, IDC Herzliya, Herzliya, Israel
  2. The MITRE Corporation, McLean, USA
