Abstract
This chapter continues to explore the applied aspects of MHapps and extends the arena to include technologies that use artificial intelligence (AI). Virtual companions (VCs) are AI social chatbot apps and programs designed to meet a variety of human needs. Some VCs, such as Woebot, have been developed specifically to support mental health; others were not designed solely for use as an MHapp but are advertised as offering wellbeing and enhanced mental health as an added benefit. In this chapter, we look at the emergence and potentialities of virtual companions, focusing on a widely used example, Replika, which is often marketed as an app that is beneficial for mental health. We examine how it has been conceptualized within the literature and draw on some data we have collected to exemplify its use as an MHapp.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Goodings, L., Ellis, D., Tucker, I. (2024). Mental Health and Virtual Companions: The Example of Replika. In: Understanding Mental Health Apps. Palgrave Studies in Cyberpsychology. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-031-53911-4_3
Publisher Name: Palgrave Macmillan, Cham
Print ISBN: 978-3-031-53910-7
Online ISBN: 978-3-031-53911-4
eBook Packages: Behavioral Science and Psychology (R0)