Abd-Alrazaq, A. A., Alajlani, M., Alalwan, A. A., Bewick, B. M., Gardner, P., & Househ, M. (2019). An overview of the features of chatbots in mental health: A scoping review. International Journal of Medical Informatics, 132, 103978.
Abd-Alrazaq, A. A., Alajlani, M., Ali, N., Denecke, K., Bewick, B. M., & Househ, M. (2021). Perceptions and opinions of patients about mental health chatbots: Scoping review. Journal of Medical Internet Research, 23(1), e17828.
Ahmad, R., Siemon, D., Fernau, D., & Robra-Bissantz, S. (2020a). Introducing "Raffi": A personality adaptive conversational agent. In PACIS 2020 Proceedings (p. 28).
Ahmad, R., Siemon, D., & Robra-Bissantz, S. (2020b). ExtraBot vs. IntroBot: The influence of linguistic cues on communication satisfaction. In B. B. Anderson, J. Thatcher, R. D. Meservy, K. Chudoba, K. J. Fadel, & S. Brown (Eds.), 26th Americas Conference on Information Systems, AMCIS 2020, Virtual Conference, August 15–17, 2020. Association for Information Systems.
Ahmad, R., Siemon, D., Gnewuch, U., & Robra-Bissantz, S. (2021a). The benefits and caveats of personality-adaptive conversational agents in mental health care. In AMCIS 2021 Proceedings.
Ahmad, R., Siemon, D., & Robra-Bissantz, S. (2021b). Communicating with machines: Conversational agents with personality and the role of extraversion. In Proceedings of the 54th Hawaii International Conference on System Sciences (p. 4043).
Allport, G. W. (1961). Pattern and growth in personality. Holt, Rinehart and Winston.
Al-Natour, S., Benbasat, I., & Cenfetelli, R. T. (2005). The role of similarity in e-commerce interactions: The case of online shopping assistants. In SIGHCI 2005 Proceedings (p. 4).
Arnoux, P.-H., Xu, A., Boyette, N., Mahmud, J., Akkiraju, R., & Sinha, V. (2017). 25 tweets to know you: A new model to predict personality with social media. In Proceedings of the International AAAI Conference on Web and Social Media (Vol. 11).
Babbie, E. R. (2020). The practice of social research. Cengage Learning.
Brandtzaeg, P. B., Skjuve, M., Dysthe, K. K., & Følstad, A. (2021). When the social becomes non-human: Young people's perception of social support in chatbots. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1–13).
Baskerville, R., & Pries-Heje, J. (2010). Explanatory design theory. Business & Information Systems Engineering, 2(5), 271–282.
Bendig, E., Erb, B., Schulze-Thuesing, L., & Baumeister, H. (2019). The next generation: Chatbots in clinical psychology and psychotherapy to foster mental health – a scoping review (pp. 1–13). Karger Publishers.
Bouchet, F., & Sansonnet, J.-P. (2012). Intelligent agents with personality: From adjectives to behavioral schemes. In Cognitively Informed Intelligent Interfaces: Systems Design and Development (pp. 177–200). IGI Global.
Boyd, R. L., & Pennebaker, J. W. (2017). Language-based personality: A new approach to personality in a digital world. Current Opinion in Behavioral Sciences, 18, 63–68.
Brendel, A. B., Mirbabaie, M., Lembcke, T.-B., & Hofeditz, L. (2021). Ethical management of artificial intelligence. Sustainability, 13(4), 1974.
Chakrabarti, C., & Luger, G. F. (2015). Artificial conversations for customer service chatter bots: Architecture, algorithms, and evaluation metrics. Expert Systems with Applications, 42(20), 6878–6897.
Chung, C. K., & Pennebaker, J. W. (2012). Linguistic Inquiry and Word Count (LIWC): Pronounced "Luke,"... and other useful facts. In Applied Natural Language Processing: Identification, Investigation and Resolution (pp. 206–229).
Clark, L., Pantidi, N., Cooney, O., Doyle, P., Garaialde, D., Edwards, J., Spillane, B., Gilmartin, E., Murad, C., Munteanu, C., Wade, V., & Cowan, B. R. (2019). What makes a good conversation? Challenges in designing truly conversational agents. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1–12). ACM.
D'Alfonso, S. (2020). AI in mental health. Current Opinion in Psychology, 36, 112–117.
Diederich, S., Brendel, A., Morana, S., & Kolbe, L. (2022). On the design of and interaction with conversational agents: an organizing and assessing review of human-computer interaction research. Journal of the Association for Information Systems, 23(1), 96–138.
Feine, J., Gnewuch, U., Morana, S., & Maedche, A. (2019). A taxonomy of social cues for conversational agents. International Journal of Human-Computer Studies, 132, 138–161.
Ferrucci, D. A. (2012). Introduction to 'This is Watson'. IBM Journal of Research and Development, 56(3.4), 1–1.
Fiske, A., Henningsen, P., & Buyx, A. (2019). Your robot therapist will see you now: Ethical implications of embodied artificial intelligence in psychiatry, psychology, and psychotherapy. Journal of Medical Internet Research, 21(5), e13216. https://doi.org/10.2196/13216
Fogg, B. J. (2002). Persuasive technology: Using computers to change what we think and do. Ubiquity, 2002(December), 2. ACM.
Gaffney, H., Mansell, W., & Tai, S. (2019). Conversational agents in the treatment of mental health problems: Mixed-method systematic review. JMIR Mental Health, 6(10), e14166.
Gnewuch, U., Morana, S., & Maedche, A. (2017). Towards designing cooperative and social conversational agents for customer service. In Proceedings of the 38th International Conference on Information Systems (ICIS 2017).
Gnewuch, U., Yu, M., & Maedche, A. (2020). The effect of perceived similarity in dominance on customer self-disclosure to chatbots in conversational commerce. In Proceedings of the 28th European Conference on Information Systems (ECIS 2020).
Golbeck, J., Robles, C., Edmondson, M., & Turner, K. (2011). Predicting personality from Twitter. In 2011 IEEE Third International Conference on Privacy, Security, Risk and Trust and 2011 IEEE Third International Conference on Social Computing (pp. 149–156). IEEE.
Goldberg, L. R. (1993). The structure of phenotypic personality traits. American Psychologist, 48(1), 26.
Graham, S., Depp, C., Lee, E. E., Nebeker, C., Tu, X., Kim, H.-C., & Jeste, D. V. (2019). Artificial intelligence for mental health and mental illnesses: An overview. Current Psychiatry Reports, 21(11), 116.
Graham, S. A., Lee, E. E., Jeste, D. V., Van Patten, R., Twamley, E. W., Nebeker, C., Yamada, Y., Kim, H.-C., & Depp, C. A. (2020). Artificial intelligence approaches to predicting and detecting cognitive decline in older adults: A conceptual review. Psychiatry Research, 284, 112732.
Gregor, S., & Hevner, A. R. (2013). Positioning and presenting design science research for maximum impact. MIS Quarterly, 37(2), 337–356.
Gregor, S., & Jones, D. (2007). The anatomy of a design theory. Journal of the Association for Information Systems, 8(5), 312–335.
Gregor, S., Chandra Kruse, L., & Seidel, S. (2020). Research perspectives: The anatomy of a design principle. Journal of the Association for Information Systems, 21(6), 2.
Grudin, J., & Jacques, R. (2019). Chatbots, humbots, and the quest for artificial general intelligence. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1–11). ACM.
Grünzig, S.-D., Baumeister, H., Bengel, J., Ebert, D., & Krämer, L. (2018). Effectiveness and acceptance of a web-based depression intervention during waiting time for outpatient psychotherapy: Study protocol for a randomized controlled trial. Trials, 19(1), 1–11.
Hevner, A. R. (2007). A three cycle view of design science research. Scandinavian Journal of Information Systems, 19(2), 4.
Hevner, A. R. (2020). The duality of science: Knowledge in information systems research. Journal of Information Technology. Advance online publication. https://doi.org/10.1177/0268396220945714
Hevner, A., March, S. T., Park, J., & Ram, S. (2004). Design science research in information systems. MIS Quarterly, 28(1), 75–105.
Iivari, J. (2015). Distinguishing and contrasting two strategies for design science research. European Journal of Information Systems, 24(1), 107–115.
Iivari, J., Hansen, M. R. P., & Haj-Bolouri, A. (2021). A proposal for minimum reusability evaluation of design principles. European Journal of Information Systems, 30(3), 286–303.
Jones, S. P., Patel, V., Saxena, S., Radcliffe, N., Ali Al-Marri, S., & Darzi, A. (2014). How Google's 'ten things we know to be true' could guide the development of mental health mobile apps. Health Affairs, 33(9), 1603–1611.
Junglas, I. A., Johnson, N. A., & Spitzmüller, C. (2008). Personality traits and concern for privacy: An empirical study in the context of location-based services. European Journal of Information Systems, 17(4), 387–402.
Kampman, O., Siddique, F. B., Yang, Y., & Fung, P. (2019). Adapting a virtual agent to user personality. In Advanced Social Interaction with Agents (pp. 111–118). Springer.
Kerr, I. R. (2003). Bots, babes and the Californication of commerce. University of Ottawa Law and Technology Journal, 1, 285.
Kim, S. Y., Schmitt, B. H., & Thalmann, N. M. (2019). Eliza in the Uncanny Valley: Anthropomorphizing consumer robots increases their perceived warmth but decreases liking. Marketing Letters, 30(1), 1–12.
Kocaballi, A. B., Berkovsky, S., Quiroz, J. C., Laranjo, L., Tong, H. L., Rezazadegan, D., Briatore, A., & Coiera, E. (2019). The personalization of conversational agents in health care: Systematic review. Journal of Medical Internet Research, 21(11), e15360.
Kocaballi, A. B., Laranjo, L., Quiroz, J., Rezazadegan, D., Kocielnik, R., Clark, L., Liao, V., Park, S., Moore, R., & Miner, A. (2020). Conversational agents for health and wellbeing.
Laranjo, L., Dunn, A. G., Tong, H. L., Kocaballi, A. B., Chen, J., Bashir, R., Surian, D., Gallego, B., Magrabi, F., Lau, A. Y. S., & Coiera, E. (2018). Conversational agents in healthcare: A systematic review. Journal of the American Medical Informatics Association, 25(9), 1248–1258.
Liu, K., & Picard, R. W. (2005). Embedded empathy in continuous, interactive health assessment. In CHI Workshop on HCI Challenges in Health Assessment (Vol. 1, p. 3).
Luxton, D. D. (2014). Recommendations for the ethical use and design of artificial intelligent care providers. Artificial Intelligence in Medicine, 62(1), 1–10.
Luxton, D. D. (2020). Ethical implications of conversational agents in global public health. Bulletin of the World Health Organization, 98(4), 285.
Maedche, A., Gregor, S., Morana, S., & Feine, J. (2019). Conceptualization of the problem space in design science research. In International Conference on Design Science Research in Information Systems and Technology (pp. 18–31). Springer.
Mairesse, F., & Walker, M. A. (2010). Towards personality-based user adaptation: Psychologically informed stylistic language generation. User Modeling and User-Adapted Interaction, 20(3), 227–278.
Mairesse, F., Walker, M. A., Mehl, M. R., & Moore, R. K. (2007). Using linguistic cues for the automatic recognition of personality in conversation and text. Journal of Artificial Intelligence Research, 30, 457–500.
Mayring, P. (2014). Qualitative content analysis: Theoretical foundation, basic procedures and software solution.
McCrae, R. R., & Costa, P. T., Jr. (1997). Personality trait structure as a human universal. American Psychologist, 52(5), 509.
McCrae, R. R., & John, O. P. (1992). An introduction to the five-factor model and its applications. Journal of Personality, 60(2), 175–215.
McTear, M., Callejas, Z., & Griol, D. (2016). The conversational interface: Talking to smart devices. Springer.
Möller, F., Guggenberger, T. M., & Otto, B. (2020). Towards a method for design principle development in information systems. In International Conference on Design Science Research in Information Systems and Technology (pp. 208–220). Springer.
Moon, Y., & Nass, C. (1996). How ‘real’ are computer personalities? Psychological responses to personality types in human-computer interaction. Communication Research, 23(6), 651–674.
Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81–103.
Nass, C., Steuer, J., Tauber, E., & Reeder, H. (1993). Anthropomorphism, agency, and ethopoeia: Computers as social actors. In INTERACT '93 and CHI '93 Conference Companion on Human Factors in Computing Systems (pp. 111–112). Association for Computing Machinery. https://doi.org/10.1145/259964.260137
Nass, C., Steuer, J., & Tauber, E. R. (1994). Computers are social actors. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 72–78). ACM.
Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43(2), 223–239.
Natale, S. (2018). If software is narrative: Joseph Weizenbaum, artificial intelligence and the biographies of ELIZA. New Media & Society, 21, 712–728. https://doi.org/10.1177/1461444818804980
Nißen, M. K., Selimi, D., Janssen, A., Cardona, D. R., Breitner, M. H., Kowatsch, T., & von Wangenheim, F. (2021). See you soon again, chatbot? A design taxonomy to characterize user-chatbot relationships with different time horizons. Computers in Human Behavior, 107043.
Pennebaker, J. W. (2011). The secret life of pronouns: How our words reflect who we are. Bloomsbury.
Pennebaker, J. W., & Francis, M. E. (1996). Cognitive, emotional, and language processes in disclosure. Cognition & Emotion, 10(6), 601–626.
Pennebaker, J. W., Boyd, R. L., Jordan, K., & Blackburn, K. (2015). The development and psychometric properties of LIWC2015. University of Texas at Austin.
Pennington, J., Socher, R., & Manning, C. (2014). GloVe: Global vectors for word representation. In Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP) (pp. 1532–1543).
Peters, O. (2013). Against the tide: Critics of digitalisation. Warners, sceptics, scaremongers, apocalypticists: 20 portraits. BIS-Verlag der Carl von Ossietzky Universität Oldenburg.
Porra, J., Lacity, M., & Parks, M. S. (2019). "Can computer based human-likeness endanger humanness?" – A philosophical and ethical perspective on digital assistants expressing feelings they can't have. Information Systems Frontiers.
Prakash, A. V., & Das, S. (2020). Intelligent conversational agents in mental healthcare services: A thematic analysis of user perceptions. Pacific Asia Journal of the Association for Information Systems, 12(2), 1.
Purao, S., Chandra Kruse, L., & Maedche, A. (2020). The origins of design principles: Where do… they all come from? In International Conference on Design Science Research in Information Systems and Technology. Springer.
Ranjbartabar, H., Richards, D., Kutay, C., & Mascarenhas, S. (2018). Sarah the virtual advisor to reduce study stress. In Proceedings of the 17th International Conference on Autonomous Agents and MultiAgent Systems (pp. 1829–1831).
Rice, D. R., & Zorn, C. (2021). Corpus-based dictionaries for sentiment analysis of specialized vocabularies. Political Science Research and Methods, 9(1), 20–35.
Rothe, H., Wessel, L., & Barquet, A. P. (2020). Accumulating design knowledge: A mechanisms-based approach. Journal of the Association for Information Systems, 21(3), 1.
Schuetzler, R., Grimes, G., Giboney, J., & Nunamaker, J. (2018). The influence of conversational agents on socially desirable responding. In Proceedings of the 51st Hawaii International Conference on System Sciences (pp. 283–292).
Shah, H., Warwick, K., Vallverdú, J., & Wu, D. (2016). Can machines talk? Comparison of Eliza with modern dialogue systems. Computers in Human Behavior, 58, 278–295.
Shum, H., He, X., & Li, D. (2018). From Eliza to XiaoIce: Challenges and opportunities with social chatbots. Frontiers of Information Technology & Electronic Engineering, 19(1), 10–26.
Skjuve, M., Følstad, A., Fostervold, K. I., & Brandtzaeg, P. B. (2021). My chatbot companion – a study of human-chatbot relationships. International Journal of Human-Computer Studies, 149, 102601.
Smith, K., Masthoff, J., Tintarev, N., & Moncur, W. (2015). Adapting emotional support to personality for carers experiencing stress. https://doi.org/10.13140/RG.2.1.3898.9929
Stieger, M., Nißen, M., Rüegger, D., Kowatsch, T., Flückiger, C., & Allemand, M. (2018). PEACH, a smartphone- and conversational agent-based coaching intervention for intentional personality change: Study protocol of a randomized, wait-list controlled trial. BMC Psychology, 6(1), 1–15.
Ta, V., Griffith, C., Boatfield, C., Wang, X., Civitello, M., Bader, H., DeCero, E., & Loggarakis, A. (2020). User experiences of social support from companion chatbots in everyday contexts: Thematic analysis. Journal of Medical Internet Research, 22(3), e16235.
Torous, J., Myrick, K. J., Rauseo-Ricupero, N., & Firth, J. (2020). Digital mental health and COVID-19: Using technology today to accelerate the curve on access and quality tomorrow. JMIR Mental Health, 7(3), e18848. https://doi.org/10.2196/18848
Venable, J. (2006). The role of theory and theorising in design science research. In Proceedings of the 1st International Conference on Design Science in Information Systems and Technology (DESRIST 2006) (pp. 1–18).
Völkel, S. T., Kempf, P., & Hussmann, H. (2020). Personalised chats with voice assistants: The user perspective. In Proceedings of the 2nd Conference on Conversational User Interfaces (pp. 1–4).
Völkel, S. T., Meindl, S., & Hussmann, H. (2021). Manipulating and evaluating levels of personality perceptions of voice assistants through enactment-based dialogue design. In CUI 2021 – 3rd Conference on Conversational User Interfaces (pp. 1–12).
Vom Brocke, J., Winter, R., Hevner, A., & Maedche, A. (2020). Special issue editorial: Accumulation and evolution of design knowledge in design science research – a journey through time and space. Journal of the Association for Information Systems, 21(3), 9.
Wasil, A. R., Palermo, E., Lorenzo-Luaces, L., & DeRubeis, R. (2021). Is there an app for that? A review of popular mental health and wellness apps. PsyArXiv. https://doi.org/10.31234/osf.io/su4ar
Weizenbaum, J. (1966). ELIZA—A computer program for the study of natural language communication between man and machine. Communications of the ACM, 9(1), 36–45.
WHO. (2017). Depression and other common mental disorders: Global health estimates. World Health Organization.
WHO. (2021). WHO executive board stresses need for improved response to mental health impact of public health emergencies. https://www.who.int/news/item/11-02-2021-who-executive-board-stresses-need-for-improved-response-to-mental-health-impact-of-public-health-emergencies. Accessed April 20, 2021.
Wibhowo, C., & Sanjaya, R. (2021). Virtual assistant to suicide prevention in individuals with borderline personality disorder. In 2021 International Conference on Computer & Information Sciences (ICCOINS) (pp. 234–237). IEEE.
Woebot Health. (2021). Woebot. https://woebothealth.com/products-pipeline/. Accessed Feb 28, 2021.
Wysa. (2021). Wysa. https://www.wysa.io/. Accessed Feb 28, 2021.
X2AI. (2021). Tess. https://www.x2ai.com/. Accessed Feb 28, 2021.
Yarkoni, T. (2010). Personality in 100,000 words: A large-scale analysis of personality and word use among bloggers. Journal of Research in Personality, 44(3), 363–373.
Yorita, A., Egerton, S., Oakman, J., Chan, C., & Kubota, N. (2019). Self-adapting chatbot personalities for better peer support. In 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC) (pp. 4094–4100). IEEE.
Zalake, M. (2020). Advisor: Agent-based intervention leveraging individual differences to support mental wellbeing of college students. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1–8).