In this context, we introduce a framework for smartphone app evaluation, developed by the American Psychiatric Association (APA) [19], as a tool to help psychiatric educators guide informed decision-making around smartphone apps for use by medical students and residents.
The APA evaluation framework offers a scaffold to guide informed decision-making around apps in a step-by-step process. While it does not produce a binary yes-or-no answer about whether to recommend or use any app, it brings attention to salient and teachable points. The framework is a four-level stage-and-gate model, presented below in Fig. 1, which first considers privacy and safety, followed by evidence, then engagement, and finally, clinical data sharing. While each stage is outlined in further detail on the APA website [19], a brief description as relevant to psychiatric educators is outlined below.
A first stage when evaluating any digital tool like an app is to ensure it will not cause harm. As discussed above in the FDA section, many psychiatric apps may fall outside of federal privacy laws like HIPAA, and so do not guarantee patients’ psychiatric information will be kept private [20]. Checking for the presence of a privacy policy, understanding that policy, and understanding any declared use of patient data are critical. A recent review of apps for use in dementia patients found that the majority of apps offered no safeguards for patient data, and that the vast majority sold and marketed any collected patient data [21]. Psychiatric educators must also be aware that patients may not appreciate the privacy risks of apps [22, 23], and trainees should balance respect for patient autonomy with beneficence when discussing apps that may have questionable privacy practices. Both educators and trainees can serve as advocates for regulations to increase patient protections [24]. Thus, guiding trainees to consider the implications of privacy policies is always a useful recommendation. While difficult to validate, it is also important to check that the app at least claims to keep patient data secure with protections like passwords, encryption, and secure storage. If an app does not explicitly claim to offer security features for patient data, it is likely that the data is not fully secure.
The potential harms from apps are more numerous than may be immediately apparent. Categorizing harm into physical, psychosocial, financial, and privacy/legal domains offers a practical means to consider potential risks. Like any new tool, smartphone apps used for psychiatric care can have unintended consequences. For example, one study investigated how an alcohol tracking app could help college students reduce risky drinking, but found that male students actually used the app as a game to see who could drink more [25]. While there have been no malpractice cases to date regarding incorrect recommendations from a psychiatric app, the lack of best practices in many apps advertised for suicide prevention is concerning, especially given the high stakes involved in providing appropriate care to suicidal patients [26].
Understanding harms from apps remains an evolving topic. Potential psychosocial harms from apps are a largely unexplored area, although research evidence shows that some participants drop out of studies because they find app use stressful or annoying. Just as poorly conducted in-person therapies can harm vulnerable or traumatized patients, untested and unvalidated app-based interventions also pose risk. Financial risks associated with app use may include inadvertent disclosure of mental health information that could legally be used by insurance companies or even employers. Finally, privacy risks may include malicious disclosure of mental health information resulting from hacks or data breaches. Considering that apps can record not only medications and self-reported symptoms, but also geolocation data on where patients live and go during the day, whom they call and text, and logs of personal contacts, social media profiles, and internet browsing histories, the consequences of such a privacy breach are enormous. These consequences are only compounded for patients experiencing domestic violence, stalking, and abuse. While apps may not carry the known biological risks of medications—such as gastrointestinal side effects—they present unique psychological and social risks that are important to be aware of and raise with trainees.
If an app appears to respect privacy and be safe, it is next worth considering what evidence supports its use. Many apps may not present psychiatric knowledge or facts appropriately [26, 27], and some may make exaggerated claims regarding their use. Psychiatry trainees, given their evolving knowledge base, may not always be able to discern which claims are evidence-based. Beyond false claims, some apps may also offer harmful recommendations: one app, purportedly designed for those in a bipolar manic episode, instructed users to drink alcohol [27]. Thus, discussing with trainees the evidence supporting app use and checking that the content appears at least of reasonable value are important teaching moments. Often, a simple PubMed search can be very revealing about whether an app is backed by clinical evidence.
If an app appears to have some evidence to justify use, it is useful to next consider engagement. Just as psychiatric educators help trainees learn to formulate treatment plans that patients will adhere to, the same applies to digital technologies like apps. Evidence suggests that most people who download a mental health app may not stick with it for more than two or three uses, and many patients may struggle to navigate how to actually use apps in daily life [28]. Thus, considering engagement and planning to match app use to a patient’s interests and technology skillset are critical. A final stage to consider is how the data or results of app use will be shared with the clinical team, impact the therapeutic relationship, and be utilized as part of the treatment plan.
While the EHR vendor Epic recently announced the release of its own marketplace for smartphone apps that can send data directly into Epic EHRs, many apps can make patient data difficult—and sometimes even impossible—to access and share. Many apps send collected clinical data to their own proprietary portal, making it inconvenient to access and risking fragmentation of the patient’s psychiatric information. Integrating patient-generated health data into existing clinical workflows requires thoughtful aggregation of raw data into clinically meaningful and actionable information, and feeding this back to providers through effective tools for data visualization and manipulation [29]. Changes to clinical workflows may be necessary to incorporate mobile health data into practice, and this may require training for providers and staff as well as support for practice redesign. This represents the final stage in the four-stage framework, as discussion of these points only makes sense in the context of an app that should potentially be used in care: one that is safe, supported by evidence, and usable by the patient.
The above framework offers a flexible tool to engage trainees in discussions about whether or not to use a specific app or technology, although it is not intended to produce definitive answers regarding app or technology use. Rather, by offering a structure for a conversation, it raises important teaching points that help guide informed decision-making. These conversations may be raised during individual supervision, offered during didactics, or occur in the clinic. It is not necessary for educators to have all the answers about any particular app, but instead, simply to recognize that if information about an app is unclear or missing, then that absence is itself a factor to consider in determining a recommendation. More well-known issues regarding boundaries with electronic communication with patients (for example, on social media sites) are also relevant, as the same device a resident now uses to put in orders, page a colleague, and pull up a clinical note may also be used to browse Facebook.