Social Robots and Dark Patterns: Where Does Persuasion End and Deception Begin?

Chapter in: Artificial Intelligence in Brain and Mental Health: Philosophical, Ethical & Policy Issues

Part of the book series: Advances in Neuroethics ((AIN))

Abstract

Socially interactive robots are steadily entering the realm of human experience. Social robots are being deployed in applications spanning entertainment, companionship, health, and well-being. We argue that the design of social robots is not neutral: they are exemplars of persuasive technologies. By virtue of their physical embodiment and anthropomorphic design, which draw on an intimate understanding of human and social psychology, communication, and behavior, they persuade human beings into the illusion of a mutually empathic and socially reciprocal interaction. In recent years, a design trend has emerged in digital media and web technologies: Dark Patterns. These use deceptive techniques to push people toward actions that are not in their best interest and that undermine their autonomy, in order to further the interests of designers, companies, or their shareholders. We highlight that social robots may harbor such dark patterns, and that they can be potent social actors that push our Darwinian buttons to erode our time, finances, and privacy, effectively harming our health and well-being.

Copyright information

© 2021 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Shamsudhin, N., Jotterand, F. (2021). Social Robots and Dark Patterns: Where Does Persuasion End and Deception Begin? In: Jotterand, F., Ienca, M. (eds) Artificial Intelligence in Brain and Mental Health: Philosophical, Ethical & Policy Issues. Advances in Neuroethics. Springer, Cham. https://doi.org/10.1007/978-3-030-74188-4_7

  • DOI: https://doi.org/10.1007/978-3-030-74188-4_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-74187-7

  • Online ISBN: 978-3-030-74188-4

  • eBook Packages: Medicine, Medicine (R0)
