Overview of Existing Intelligent Applications

Artificial intelligence (AI), commonly defined as “the development of computer systems able to perform tasks that normally require human intelligence” [1], has become increasingly prevalent in modern medicine and in the field of psychiatry. AI draws on a wide variety of computer algorithms classified under machine learning (ML), including random forests, support vector machines, linear discriminant analysis, and natural language processing techniques [2]. In the 1960s, a computer program known as ELIZA was developed to emulate the conversational abilities of a psychotherapist. The idea was for the machine to simulate human conversation while allowing the patient to do most of the cognitive work of interpretation. Although the program was intended only for research into natural language processing, it ultimately helped propel the broader conversation about artificial intelligence [3]. In 1971, another computer model was designed to simulate paranoia in the setting of a diagnostic psychiatric interview, in an attempt to characterize the inner structure of the paranoid behavior clinicians often encounter when interviewing paranoid patients [4].
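
For readers unfamiliar with these methods, the toy sketch below compares the classifier families named above (random forest, support vector machine, and linear discriminant analysis) on synthetic data using scikit-learn; the dataset, feature counts, and settings are illustrative placeholders rather than anything drawn from a clinical study.

```python
# Illustrative only: a toy comparison of the ML classifier families named above
# on synthetic data. Nothing here reflects a specific clinical study or dataset.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for tabular clinical features (e.g., symptom scores).
X, y = make_classification(n_samples=300, n_features=20, n_informative=8,
                           random_state=0)

models = {
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "support vector machine": SVC(kernel="rbf"),
    "linear discriminant analysis": LinearDiscriminantAnalysis(),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)  # 5-fold cross-validation
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```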

Within the last two decades, AI has begun to incorporate neuroimaging studies of psychiatric patients into deep learning models that classify psychiatric disorders. For example, Kim et al. distinguished schizophrenia patients from healthy controls with an accuracy of 85.5% by applying a deep learning model to functional connectivity patterns extracted from resting-state functional MRI [5]. These findings suggest that deep learning can classify psychosis using neuroanatomical and neurofunctional information.
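
As a purely illustrative sketch of this general approach (not Kim et al.’s actual pipeline), the example below flattens the unique region-pair correlations of each subject’s functional-connectivity matrix into a feature vector and trains a small feedforward network to separate two groups; the subject count, region count, and labels are simulated placeholders.

```python
# Minimal sketch, assuming simulated data: classify subjects from flattened
# functional-connectivity matrices with a small feedforward neural network.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_subjects, n_regions = 100, 90          # hypothetical subject/ROI counts

# Simulated connectivity: one symmetric correlation-like matrix per subject.
conn = rng.uniform(-1, 1, size=(n_subjects, n_regions, n_regions))
conn = (conn + conn.transpose(0, 2, 1)) / 2

# Keep only the unique region-pair values (upper triangle, no diagonal).
iu = np.triu_indices(n_regions, k=1)
X = conn[:, iu[0], iu[1]]
y = rng.integers(0, 2, size=n_subjects)  # 0 = control, 1 = patient (placeholder)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```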

Other applications of AI include digital gaming interventions and smartphone applications. Digital gaming was initially used for symptom tracking and psychoeducation but has since evolved into complete interventional programs. Gaming modalities now address psychosocial and cognitive domains, focusing on specific deficits in various psychiatric disorders. Services provided may include cognitive behavioral therapy, behavioral modification, social motivation, attention enhancement, and biofeedback [6]. Games continue to have widespread appeal and can also be delivered via smartphones. Smartphone applications are another avenue for AI, with projects such as mindLAMP (Learn, Assess, Manage, Prevent) and BiAffect. The mindLAMP application uses smartphones and their embedded sensors to understand people’s experiences of mental illness and helps predict recovery through the collection of surveys, cognitive tests, GPS coordinates, and exercise information. BiAffect applies machine learning algorithms to keyboard metadata, such as variability in typing dynamics, errors, and pauses in user messaging, to predict manic and depressive episodes in people with bipolar disorder [7].
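
The sketch below illustrates, in hypothetical form, the general idea behind keyboard-metadata models of this kind (it is not BiAffect’s actual algorithm): each typing session is summarized by a few dynamics features, and a standard classifier is trained to flag sessions associated with a mood episode. All features, labels, and distributions are simulated placeholders.

```python
# Hypothetical sketch of a keystroke-dynamics classifier on simulated data;
# this is not BiAffect's actual algorithm or feature set.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_sessions = 500

# Per-session summary features: inter-keystroke timing variability, backspace
# (error-correction) rate, and mean pause length between messages.
X = np.column_stack([
    rng.gamma(2.0, 0.05, n_sessions),    # timing variability (s)
    rng.beta(2, 20, n_sessions),         # backspace rate (fraction of keys)
    rng.exponential(1.5, n_sessions),    # mean pause length (s)
])
y = rng.integers(0, 2, n_sessions)       # 1 = session during a mood episode (placeholder)

clf = GradientBoostingClassifier(random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {scores.mean():.2f}")
```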

AI applications appear to have great potential to transform the delivery of psychiatric care and have already been used to assist with psychiatric diagnosis, symptom tracking, prediction of acute disease exacerbations and of recovery, and psychoeducation. In the era of the COVID-19 pandemic, another form of AI technology has gained momentum in offering digital help for psychiatric disorders: chatbots.

Chat/Therapy Bots

Mental illness represents a large burden for individuals, communities, and nations. The emergence of COVID-19 presented doctors with an unprecedented challenge: how to increase access to care during a pandemic. Therapeutic tools that work over SMS text messaging or other messaging platforms are currently being explored as a way to address psychiatric symptoms exacerbated by the unrelenting global health crisis and to help those with an existing mental health condition.

Woebot is an automated conversational application, available through Facebook Messenger or as a mobile app, that provides tools automating the process of cognitive behavioral therapy (CBT). It was developed to monitor symptoms and manage episodes of anxiety and depression through learned skills such as identifying and challenging cognitive distortions [8]. In one randomized controlled trial, 70 subjects were randomized to either Woebot or an e-book about depression; the Woebot group reported a significant decrease in depression compared to the e-book group [9]. Because chatbots are conversational and keep users engaged, higher levels of engagement might explain the significantly better outcomes and why these tools draw more attention from financial sponsors [10]. Tess is another program, accessed through a phone number, that uses text messaging to coach individuals through times of emotional distress. This tool enables users to have therapeutic conversations similar to those they might have with a psychologist and delivers emotional wellness coping strategies [11].

In a similar approach, new forms of avatar therapy have been developed to provide therapeutic conversations with their users. Replika is a smartphone application that allows users to have conversations about themselves and thereby gain a better understanding of their own good qualities. Replika reconstructs a footprint of the user’s personality from digital remnants and the text conversations the user has with their avatar. One of the strongest draws of Replika is that the user can have vulnerable conversations with their avatar without fear of judgement. Similar to therapy sessions with a psychiatrist or personal conversations with a trusted friend, the avatar can hold therapeutic conversations with users and help them gain insight into their own personality [12]. Another use of avatars is Avatar Therapy, in which computer-generated faces, driven by intelligent algorithms, interact with patients with schizophrenia. Patients undergo six ten-minute sessions of Avatar Therapy in which they challenge the persecutory voice hallucinations they experience and gradually learn to gain control over the distressing voices. Initial studies have shown that Avatar Therapy decreases the distress patients feel in relation to their voices, the frequency with which they hear them, and the extent to which they feel overwhelmed by them [13].

In addition to AI designed to replicate human processes, clinicians and scientists have explored the use of intelligent animal-like robots to improve psychiatric outcomes such as reducing stress, loneliness, and agitation and improving mood. Companion bots such as Paro, a robotic seal, and eBear, an expressive bear-like robot, interact with patients and provide the benefits of animal therapy. Paro has already been used to help patients with dementia who may be isolated or experiencing feelings of depression. AI-enabled robots have also been studied as aids for children with autism spectrum disorders (ASDs) through education and therapy. Robots such as Kaspar and Nao can teach children social skills and help them with facial recognition and appropriate gaze response, with initial studies reporting that children with ASDs performed better with robotic intervention than with human therapists [8]. Another application that has made an impact on these individuals is Apple’s virtual assistant Siri, which can engage with children who have ASDs and accommodate the hyper-focus on specific interests that can come with the disorder. Humans may not have the desire or the patience to engage with children in the minutiae they are focusing on, but Siri does. Through engagements like this, provided by AI assistants such as Siri, children can develop the skills necessary to interact socially with others without negative repercussions for the social faux pas that inevitably occur. Siri can be of great help by providing the child with a safe learning environment and the patience necessary to practice these skills [14].

Discussion

In summary, the future of AI in psychiatry appears promising, with growing need for and utilization of AI bots in managing psychiatric symptoms and augmenting therapeutic treatments. Mental illness continues to be a heavy burden for society at large. AI-based interventions such as those discussed in this article could provide some relief of that burden, especially in an era marked by a shortage of mental health providers.

Innovations such as chatbots, avatar therapy, and companion bots offer many advantages, including a reduction in the stigma associated with sharing symptoms of mental illness with physicians, increased personal comfort with self-disclosure, cost-effectiveness, and broader accessibility. On the other hand, AI bots are not endowed with the varied skill set of a trained psychiatrist or therapist, are limited in their ability to apply personal patient details to assist with the cognitive work required of the patient, and may lack the nuanced emotional awareness and empathetic response of a human counterpart. Currently, there are no applications directly available in the clinical setting and no national standard against which to compare them during technological development [7].

Another major concern involves legal responsibility: who would be responsible if a bot makes an incorrect diagnosis or incorrectly interprets a patient’s distress [15]? Additional concerns surround the safety of protected health information (PHI) during its exchange, as regulated under the Health Insurance Portability and Accountability Act (HIPAA). Fortunately, cloud-based web services such as Amazon Web Services (AWS) have recently started to offer “HIPAA-compliant” connections [16].

Despite the advances in this field of technology, newer tools have not been immediately embraced by mental health providers. Some psychiatrists who strongly value interpersonal interactions with patients may be slow to adopt these new methods, indicating a slow diffusion of innovation in the mental health community [17]. Furthermore, adoption of mental health applications is growing quickly, which makes risk assessment more challenging and perhaps more likely to occur only after harm has already been done. Future work should be directed toward investigating the efficacy of AI-based interventions in large, controlled trials and toward methods for incorporating AI into clinical practice.