Abstract
Over the past 15 years, researchers, practitioners, and policy-makers have observed poor program usage of unguided e-mental health interventions in real-world settings. This paper proposes that focusing on the impact of program usage, however, distracts us from the mechanism of change that is linked with the targeted clinical outcomes: the incorporation of therapeutic activities. Uncovering this relationship is particularly important in digital mental health intervention research because the therapeutic activities meant to achieve beneficial outcomes are not always explicit. The paper presents a framework that may help disentangle different conceptual aspects in order to better investigate the relationship between user engagement with the digital program, therapeutic activities, and clinical outcomes. Critically, one of the main challenges involves determining whether the use of the digital program itself is tied to a therapeutic activity. Such a framework may enable a better understanding of how e-mental health interventions work and help address intervention design failure points.
Introduction
User engagement with a digital health intervention is considered to be a necessary condition for an intervention’s success. Clearly, an intervention cannot influence individuals who are not using it. Unfortunately, over the past 15 years, researchers, practitioners, and policy-makers have observed poor user engagement with unguided digital mental health interventions in real-world settings (e.g., Baumel et al., 2019a, b; Christensen et al., 2009; Fleming et al., 2018). As this phenomenon limits the potential of unguided digital interventions to improve public health, attempts have been made to conceptually define and understand user engagement and advance the ways it should be investigated and reported (e.g., Doherty & Doherty, 2018; Lalmas et al., 2014; Perski et al., 2016; Yardley et al., 2016).
Conceptualizing and investigating the relationship between user engagement with the digital program, the performance of therapeutic activities as a result of program use, and improvement in clinical outcomes could help to shed light on how digital health interventions work (e.g., Yardley et al., 2016; Zhang et al., 2019). Uncovering this relationship is particularly important in digital mental health intervention research because the therapeutic activities meant to achieve beneficial outcomes are not always explicit. In other words, in the mental health domain, important therapeutic activities—such as cognitive reframing, self-talk, or exercising gratitude—may occur in one’s mind without any direct behavioral footprint. Critically, one of the main challenges involves determining whether the use of the digital program itself is tied to a therapeutic activity. Lack of clarity around this point could create confusion in how we conceptualize the results of empirical studies in the digital mental health domain and limit the impact of research.
I propose that using an analytic approach to distinguish between different user actions and experiences makes it feasible to conceptually differentiate between user engagement and the incorporation of therapeutic activities. Such a framework enables a better understanding of how digital mental health interventions work and helps address design failure points. I begin with a brief review of the connection between therapeutic activities and clinical outcomes, as this provides an important background to the proposed framework. I then present the suggested framework for reporting and an example to illustrate the importance of such a framework when trying to investigate how interventions work.
Therapeutic Activities and Clinical Outcomes
Behavioral and mental health interventions are developed based on a theoretical understanding that the incorporation of targeted therapeutic activities can function as a mechanism of change leading to clinical outcomes (e.g., Abraham & Michie, 2008; Mohr et al., 2013; Ritterband et al., 2009). In cognitive behavioral therapy (CBT) targeting people with generalized anxiety disorder, for example, a developer may lean on a theory that links changes in cognitive and behavioral components to a change in symptoms of anxiety. Therapeutic activities may include cognitive components, such as cognitive restructuring exercises and the incorporation of problem-solving techniques, and behavioral components, such as daily muscle relaxation training and exposure (Khodarahimi & Pole, 2010; Stanley et al., 2014). Users who engage in the required therapeutic activities according to the “prescription” are those who adhere to the treatment plan. Since it is simple to document digital footprints, program usage is easily obtained and has been commonly reported in research on digital health interventions. However, program usage is not necessarily equal to the incorporation of a therapeutic activity. In what follows, I present the user pathway framework for understanding the relationship between program usage, therapeutic activities, and clinical outcomes.
The User Pathway Framework
Program Usage and Experience of Engagement
Figure 1 presents an analytic illustration of the user pathway framework, which begins with program usage and ends with clinical outcomes. The literature suggests that the first two components—program usage and experience of engagement with the digital program—can be viewed as two indicators of user engagement (e.g., Perski et al., 2016, 2020). Program usage refers to behavioral measures that describe when the program is used and how often. Measures of usage depend on the delivery medium (e.g., mobile app, website) and include metrics such as the number of program logins, distinct days in which the program was accessed, longevity (number of days between the first and last login), screen time, user retention, and number of modules completed (e.g., Baumel & Kane, 2018; Baumel et al., 2019b; Miller et al., 2019; Rahman et al., 2017). There is no single concrete definition for engagement as an experience and how it can be precisely measured because engagement depends on the clinical target and the way the intervention is designed. A common ground, however, is the idea that the experience of engagement is related to the quality of attention, involvement, and immersion during program usage (for a review, see Glas & Pelachaud, 2015; Perski et al., 2016).
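As a concrete illustration, several of the usage metrics listed above can be computed directly from timestamped access logs. The sketch below is a minimal example, assuming a hypothetical log format of one ISO timestamp per program login; the metric names are illustrative, not a reporting standard.

```python
from datetime import datetime

# Hypothetical access log: one ISO timestamp per program login.
logins = [
    "2022-01-03T09:15:00", "2022-01-03T20:40:00",
    "2022-01-07T08:05:00", "2022-01-21T18:30:00",
]

times = [datetime.fromisoformat(t) for t in logins]

usage = {
    "n_logins": len(times),                           # total program logins
    "distinct_days": len({t.date() for t in times}),  # days the program was accessed
    "longevity_days": (max(times) - min(times)).days, # days between first and last login
}
print(usage)  # {'n_logins': 4, 'distinct_days': 3, 'longevity_days': 18}
```

Metrics such as screen time or modules completed would require richer event logs than the login timestamps assumed here.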
Minimal program usage is a necessary condition for a user to experience engagement because one cannot be involved with or pay attention to a program that is not being used. However, details regarding users’ usage do not necessarily suffice to inform us about the quality of their engagement with the program. For example, one user could view educational materials without really concentrating on reading them, whereas another user could be highly concentrated and involved. These two users’ ability to remember what they learned and act on that learning experience would be completely different.
Nonetheless, a more sensitive examination of program usage patterns could reveal important details that shed light on the quality of engagement, provided that researchers have access to a dataset that describes usage in detail. For example, in their novel study, Zhang and colleagues categorized user behaviors to identify clinically meaningful use of mental health apps (Zhang et al., 2019). This work required the investigation of system use data that included user activities such as changing reminders, creating mantras, and scheduling therapeutic activities. These recorded usage events capture far more nuanced user attention and involvement than the number of user logins. Most importantly, the quality of user engagement can be directly documented based on user responses to questions, success in embedded exercises, and any other event that requires a certain quality of engagement for successful completion. User engagement with educational materials, for example, could be measured if users are not passive learners and must demonstrate their knowledge gain in exercises during the learning phase.
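To illustrate the kind of event-level analysis described above, the sketch below sorts recorded usage events into passive and active categories. The event names and the two-category taxonomy are hypothetical, loosely inspired by the examples from Zhang et al. (changing reminders, creating mantras, scheduling activities); they are not taken from any actual app's logging scheme.

```python
# Hypothetical event taxonomy: events that demand active involvement
# versus events compatible with passive viewing.
ACTIVE_EVENTS = {"create_mantra", "change_reminder", "schedule_activity", "complete_exercise"}
PASSIVE_EVENTS = {"login", "view_page"}

def engagement_profile(events):
    """Count passive vs. active events in a user's usage log."""
    profile = {"passive": 0, "active": 0, "other": 0}
    for event in events:
        if event in ACTIVE_EVENTS:
            profile["active"] += 1
        elif event in PASSIVE_EVENTS:
            profile["passive"] += 1
        else:
            profile["other"] += 1
    return profile

log = ["login", "view_page", "create_mantra", "view_page", "schedule_activity"]
print(engagement_profile(log))  # {'passive': 3, 'active': 2, 'other': 0}
```

Two users with identical login counts could produce very different profiles here, which is exactly the distinction that login counts alone cannot capture.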
User Engagement and Therapeutic Activities
In digital mental health interventions, a “high quality of engagement” does not necessarily mean that the user engages in the therapeutic activity that is directly related to the targeted mechanism of change. Therefore, investigators should strive to reveal how engagement translates into a therapeutic activity. Yardley and colleagues suggested differentiating between engagement with the program and engagement with behavior change; while these types of engagement are intimately linked, they are not the same (Yardley et al., 2016). As an illustration, consider an app that promotes behavioral activation for users with depression. In behavioral activation, the mechanism of change involves an increase in positive activities that expose the individual to reinforcing environmental contingencies, which in turn produce beneficial changes in thoughts and mood (Dahne et al., 2017; Hopko et al., 2003). Consequently, when a user pays appropriate attention to the psychoeducational materials within the digital app, which explain the importance of increasing physical activity levels, this learning event is not the same as the targeted therapeutic activity, which is the physical activity itself. In this case, the intervention developers have assumed that user engagement is necessary for the user to eventually conduct this targeted activity. However, if the user does not eventually conduct the desired activity, we cannot expect an improvement in his/her symptoms of depression; moreover, we cannot expect the documented metrics of engagement with the digital program to be linked to clinical outcomes.
In other words, an effective intervention creates a link between user engagement and adherence to therapeutic activities that foster the mechanism of change. This notion has conceptual overlap with Yardley and colleagues’ notion of “effective user engagement,” which is defined as sufficient engagement with the program to achieve the desired outcomes (Yardley et al., 2016). Similarly, the investigator should ask what kind of engagement with the digital program is needed for the successful incorporation of therapeutic activities.
It is important to note that, in some cases, user engagement is equal to the performance of a therapeutic activity, for example, when users appropriately engage with a mindfulness routine to decrease anxiety or stress—assuming that the mindfulness exercise by itself is directly related to the desired clinical outcome (Pascoe et al., 2017)—or when users engage in a daily cognitive restructuring exercise incorporated within an app as part of CBT for anxiety (Neal-Barnett et al., 2019). These instances do not diminish the importance of the framework, but rather stress the importance of carefully understanding each aspect of the pathway. In such cases, the investigator is required to differentiate between user engagement in parts of the program that are not equal to the targeted therapeutic activity and engagement in parts of the program that are equal to such an activity.
Accordingly, it is important to be able to conceptually distinguish between (1) digital intervention components that are used to promote the targeted therapeutic activities and (2) digital intervention components that can be considered equal to the incorporation of the key targeted therapeutic activity. This is because these two types of components are not supposed to invoke beneficial outcomes in the same way. This distinction can also be helpful in creating a language that is similar to that used in different yet related fields. Using a simple example, a psychiatrist will likely differentiate between a patient’s attendance at meetings (usage), attention and involvement (experience of engagement), and adherence to a treatment plan, which is evaluated based on whether the patient actually takes his/her medications according to the prescription (targeted therapeutic activity). Clearly, the psychiatrist will not expect beneficial clinical outcomes in the absence of the targeted therapeutic activity.
A Clarifying Example: Implementing the User Pathway Framework to Illustrate the Moderating Impact of Product Design on Clinical Outcomes
Implementing the user pathway framework enables researchers to more precisely investigate how different phenomena in digital health interventions relate to different metrics and targeted outcomes. Figure 2 makes use of this framework to present a hypothetical model illustrating the moderating impact of product design on clinical outcomes and, in particular, how changes in design are expected to impact different aspects along the pathway. The two quality domains incorporated into this hypothetical model were identified and constructed through the Enlight quality assessment tool (Baumel et al., 2017a, b). These domains are based on a comprehensive systematic review that was performed to identify relevant quality criteria related to eHealth and mHealth products, emerging from 84 identified sources of information (for a review, see Baumel et al., 2017a, b).
The systematic review showed that aspects of product design impact the extent to which users want to increasingly utilize a program and become more involved with it (user engagement features). We expect that programs that are of high quality in these aspects will yield higher program usage and better user experience through involvement and attention. In the hypothetical model, however, the link to clinical outcomes rests on whether the user’s increase in engagement translates into engagement in a therapeutic activity. Consider a user of an online CBT program targeting social anxiety disorder (SAD) who receives psychoeducation on the importance of exposure and how it could be exercised. A program that presents the user with captivating, interactive, and well-targeted materials is expected to increase the user’s involvement and attention during this learning experience. This is an important step in the pathway toward incorporating exposure as a therapeutic activity, but it does not directly provoke an exposure event.
Going back to the hypothetical model, the literature points to other aspects of product design that are meant to directly provoke positive changes in users’ lives by embedding a persuasive system design focused on incorporating behavior change techniques (therapeutic persuasiveness features) (Baumel et al., 2017a, b). We expect that a CBT program for SAD that provides users with calls to action, such as triggers embedded in people’s lives (e.g., setting relevant exposure goals, reminding and inspiring users to utilize the exposure technique), load reduction of therapeutic activities (e.g., providing exposure opportunities), and a real data-driven approach (e.g., the user’s therapeutic activities impact the next steps) will increase the chances of highly engaged users feeling more accountable and incorporating beneficial therapeutic activities into their lives. From the perspective of behavior change theories, a program lacking any of these features cannot ideally support users’ self-management of desired and undesired behaviors (Abraham & Michie, 2008; Doshi et al., 2003; Michie et al., 2008). Accordingly, addressing different aspects of the pathway will enable researchers and developers to more precisely consider the importance of product design in reaching the desired outcomes.
Discussion
The proposed user pathway framework presents a simple approach to improve our understanding of how digital mental health interventions work. Addressing the differentiation between key aspects along the pathway is helpful for identifying the impact of program manipulations and failure points. When the correlation between program usage and clinical outcomes is not high, we must ask: (1) What are users engaged with? Are there any measures that might reveal the quality of user attention and involvement in the program? (2) What therapeutic activities do users conduct? Are these directly or indirectly related to user engagement? (3) Are the therapeutic activities related to clinical outcomes?
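These questions can be made concrete with a simple correlational check along the pathway. The sketch below uses fabricated, purely illustrative per-user numbers (not real data) to show how usage-outcome and activity-outcome correlations can diverge when program usage does not reliably translate into therapeutic activity.

```python
from math import sqrt

def pearson(x, y):
    """Plain Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))

# Illustrative (hypothetical) per-user data:
logins      = [2, 15, 20, 3, 18, 25, 6, 12]  # program usage
activities  = [1, 2, 12, 2, 3, 14, 5, 3]     # e.g., exposure events completed
improvement = [0, 1, 10, 1, 2, 12, 4, 2]     # pre-post symptom improvement

r_usage = pearson(logins, improvement)       # usage <-> outcome
r_activity = pearson(activities, improvement)  # therapeutic activity <-> outcome
print(round(r_usage, 2), round(r_activity, 2))  # 0.75 1.0
```

In this toy example, usage correlates only moderately with improvement, while the therapeutic activity tracks it almost perfectly, mirroring the framework's claim that the activity, not raw usage, is the proximal link to outcomes.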
As research on digital mental health interventions progresses, scholars are required to unveil potential black boxes. Special attention should be given to the documentation of targeted therapeutic activities during program use, as these activities are conceptually linked to the achievement of desired clinical outcomes. Dismantling studies or factorial study designs that document and examine such therapeutic activities could help move the field forward by enabling a better understanding of the mechanisms of change associated with clinical outcomes under the different conditions to which participants were randomized.
It is worth noting that some issues related to the incorporation of the user pathway framework need to be resolved. First, in some programs, the incorporation of therapeutic activities is already embedded and automatically obtained using passive sensing—for example, in a digital behavioral activation intervention promoting physical activity that adapts based on the levels of the targeted activity (e.g., step counts). In other programs, especially within the mental health domain, therapeutic activities are not always explicit or automatically documented. Second, developers sometimes lean on a sequence of events meant to provoke one therapeutic activity (e.g., psychoeducation → goal setting → activity), which complicates the analytical investigation.
Nevertheless, expanding the framework of our reporting by disentangling different conceptual aspects could be key in moving the translational field of digital mental health interventions forward. The user pathway framework is one example of such an effort. I hope that it will help to expand reporting and subsequently improve research on the relationship between user engagement and clinical outcomes.
References
Abraham, C., & Michie, S. (2008). A taxonomy of behavior change techniques used in interventions. Health Psychology, 27(3), 379–387.
Baumel, A., & Kane, J. M. (2018). Examining predictors of real-world user engagement with self-guided eHealth interventions: Analysis of mobile apps and websites using a novel dataset [Original Paper]. Journal of Medical Internet Research, 20(12), e11491. https://doi.org/10.2196/11491
Baumel, A., Birnbaum, M. L., & Sucala, M. (2017a). A systematic review and taxonomy of published quality criteria related to the evaluation of user-facing eHealth programs. Journal of Medical Systems, 41(8), 128.
Baumel, A., Edan, S., & Kane, J. M. (2019a). Is there a trial bias impacting user engagement with unguided e-mental health interventions? A systematic comparison of published reports and real-world usage of the same programs. Translational Behavioral Medicine, 9(6), 1020–1033.
Baumel, A., Faber, K., Mathur, N., Kane, J. M., & Muench, F. (2017b). Enlight: A comprehensive quality and therapeutic potential evaluation tool for mobile and web-based eHealth interventions [Original Paper]. Journal of Medical Internet Research, 19(3), e82. https://doi.org/10.2196/jmir.7270
Baumel, A., Muench, F., Edan, S., & Kane, J. M. (2019b). Objective user engagement with mental health apps: Systematic search and panel-based usage analysis. Journal of Medical Internet Research, 21(9), e14567.
Christensen, H., Griffiths, K. M., & Farrer, L. (2009). Adherence in internet interventions for anxiety and depression: Systematic review. Journal of Medical Internet Research, 11(2), e13.
Dahne, J., Lejuez, C. W., Kustanowitz, J., Felton, J. W., Diaz, V. A., Player, M. S., & Carpenter, M. J. (2017). Moodivate: A self-help behavioral activation mobile app for utilization in primary care—Development and clinical considerations. The International Journal of Psychiatry in Medicine, 52(2), 160–175.
Doherty, K., & Doherty, G. (2018). Engagement in HCI: Conception, theory and measurement. ACM Computing Surveys (CSUR), 51(5), 1–39.
Doshi, A., Patrick, K., Sallis, J. F., & Calfas, K. (2003). Evaluation of physical activity web sites for use of behavior change theories. Annals of Behavioral Medicine, 25(2), 105–111.
Fleming, T., Bavin, L., Lucassen, M., Stasiak, K., Hopkins, S., & Merry, S. (2018). Beyond the trial: Systematic review of real-world uptake and engagement with digital self-help interventions for depression, low mood, or anxiety. Journal of Medical Internet Research, 20(6), e199.
Glas, N., & Pelachaud, C. (2015). Definitions of engagement in human-agent interaction. 2015 International Conference on Affective Computing and Intelligent Interaction (ACII), Xi’an, China.
Hopko, D. R., Lejuez, C., Ruggiero, K. J., & Eifert, G. H. (2003). Contemporary behavioral activation treatments for depression: Procedures, principles, and progress. Clinical Psychology Review, 23(5), 699–717.
Khodarahimi, S., & Pole, N. (2010). Cognitive behavior therapy and worry reduction in an outpatient with generalized anxiety disorder. Clinical Case Studies, 9(1), 53–62.
Lalmas, M., O’Brien, H., & Yom-Tov, E. (2014). Measuring user engagement. Synthesis Lectures on Information Concepts, Retrieval, and Services, 6(4), 1–132.
Michie, S., Johnston, M., Francis, J., Hardeman, W., & Eccles, M. (2008). From theory to intervention: Mapping theoretically derived behavioural determinants to behaviour change techniques. Applied Psychology, 57(4), 660–680.
Miller, S., Ainsworth, B., Yardley, L., Milton, A., Weal, M., Smith, P., & Morrison, L. (2019). A framework for Analyzing and Measuring Usage and Engagement Data (AMUsED) in digital interventions. Journal of Medical Internet Research, 21(2), e10966.
Mohr, D. C., Burns, M. N., Schueller, S. M., Clarke, G., & Klinkman, M. (2013). Behavioral intervention technologies: Evidence review and recommendations for future research in mental health. General Hospital Psychiatry, 35(4), 332–338.
Neal-Barnett, A., Stadulis, R., Ellzey, D., Jean, E., Rowell, T., Somerville, K., & Hogue, M. (2019). Evaluation of the effectiveness of a musical cognitive restructuring app for black inner-city girls: Survey, usage, and focus group evaluation. JMIR mHealth and uHealth, 7(6), e11310.
Pascoe, M. C., Thompson, D. R., Jenkins, Z. M., & Ski, C. F. (2017). Mindfulness mediates the physiological markers of stress: Systematic review and meta-analysis. Journal of Psychiatric Research, 95, 156–178.
Perski, O., Blandford, A., Garnett, C., Crane, D., West, R., & Michie, S. (2020). A self-report measure of engagement with digital behavior change interventions (DBCIs): Development and psychometric evaluation of the “DBCI Engagement Scale.” Translational Behavioral Medicine, 10(1), 267–277.
Perski, O., Blandford, A., West, R., & Michie, S. (2016). Conceptualising engagement with digital behaviour change interventions: A systematic review using principles from critical interpretive synthesis. Translational Behavioral Medicine, 7(2), 254–267.
Rahman, Q. A., Janmohamed, T., Pirbaglou, M., Ritvo, P., Heffernan, J. M., Clarke, H., & Katz, J. (2017). Patterns of user engagement with the mobile app, Manage My Pain: Results of a data mining investigation. JMIR mHealth and uHealth, 5(7), e96.
Ritterband, L. M., Thorndike, F. P., Cox, D. J., Kovatchev, B. P., & Gonder-Frederick, L. A. (2009). A behavior change model for internet interventions. Annals of Behavioral Medicine, 38(1), 18–27. https://doi.org/10.1007/s12160-009-9133-4
Stanley, M. A., Wilson, N. L., Amspoker, A. B., Kraus-Schuman, C., Wagener, P. D., Calleo, J. S., & Williams, S. (2014). Lay providers can deliver effective cognitive behavior therapy for older adults with generalized anxiety disorder: A randomized trial. Depression and Anxiety, 31(5), 391–401.
Yardley, L., Spring, B. J., Riper, H., Morrison, L. G., Crane, D. H., Curtis, K., & Blandford, A. (2016). Understanding and promoting effective engagement with digital behavior change interventions. American Journal of Preventive Medicine, 51(5), 833–842. https://doi.org/10.1016/j.amepre.2016.06.015
Zhang, R., Nicholas, J., Knapp, A. A., Graham, A. K., Gray, E., Kwasny, M. J., & Mohr, D. C. (2019). Clinically meaningful use of mental health apps and its effects on depression: Mixed methods study. Journal of Medical Internet Research, 21(12), e15644.
Ethics declarations
Research Involving Human Participants
This work does not involve human participants.
Statement on the Welfare of Animals
This article does not contain any studies with animals performed by any of the authors.
Conflict of Interest
Dr. Baumel received consulting fees from Pro-Change Behavior Systems, Inc. The author declares no competing interests.
Baumel, A. Therapeutic Activities as a Link between Program Usage and Clinical Outcomes in Digital Mental Health Interventions: a Proposed Research Framework. J. technol. behav. sci. 7, 234–239 (2022). https://doi.org/10.1007/s41347-022-00245-7