Wysa: A promising AI-enabled chatbot for expanding mental health services
By Margaret Saville
Bachelor of Arts Student in Psychology, McGill University, Canada | December 2023
Reviewed by Alexandre Lemyre, Ph.D.
Given that most individuals facing mental health challenges worldwide do not have access to professional care, mental health apps have been identified as a promising, cost-effective, and scalable solution for bridging the treatment gap. While the availability of mental health apps is on the rise, they should be treated with caution as only a handful have undergone rigorous testing or are evidence-based. Among the evidence-based apps that do exist, some, like Wysa, contain chatbots.
Boasting an app store rating of 4.8 stars, the conversational agent Wysa has received recent attention in the scientific literature as a promising tool for scaling up the delivery of mental health services. Wysa is a publicly available mobile app with a free version and a paid premium version. According to Beatty and colleagues (2022), Wysa provides users with personalized empathetic support, a space to express themselves freely, and tools based on cognitive behavioral therapy (CBT). These CBT-based tools include techniques such as behavioral activation and cognitive restructuring. According to Inkster and colleagues (2018), Wysa employs these CBT-grounded techniques to support the management of anxiety, conflict, sleep, focus, grief, worry, and other problems. Users also have access to several clinical interventions, including talk therapy, education on self-care practices, mindfulness techniques, relaxation strategies, and gratitude journaling.
The internal workings of Wysa have been described in three studies, published in 2018, 2021, and 2022. Using a text-based conversational interface, Wysa first prompts the user for information about their emotions and well-being; the user can then choose to converse about anything. This free-text conversational agent can handle complex user input, as the AI models created by its in-house clinicians can recognize a wide array of emotions. The responses themselves are not AI-generated; instead, Wysa draws on intervention techniques written by the internal team, and all content and techniques are approved by the Wysa scientific advisory board. Through this AI-guided listening and support, the Wysa app helps users develop positive self-expression and mental resilience.
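As a rough illustration of this kind of design, the sketch below shows a hypothetical "classify, then select" pipeline in which free-text input is mapped to a broad emotion category and a pre-written, clinician-approved response is chosen from a fixed library rather than generated on the fly. Every name, keyword, and response in the sketch is an assumption made for illustration only; it does not reflect Wysa's actual code, models, or content.

```python
# Minimal, hypothetical sketch of a rule-based "classify, then select" chatbot turn.
# This is NOT Wysa's implementation; the categories, keywords, and responses below
# are invented purely to illustrate the general architecture described above.
from typing import Dict, List

# Stand-in for a library of pre-written, clinician-approved techniques.
RESPONSE_LIBRARY: Dict[str, List[str]] = {
    "anxiety": ["Let's try a slow-breathing exercise together."],
    "sadness": ["Could we plan one small, enjoyable activity for today?"],  # behavioral activation
    "neutral": ["Thanks for sharing. What's on your mind right now?"],
}

def classify_emotion(user_message: str) -> str:
    """Stand-in for the AI model that recognizes the user's emotional state.
    A real system would use trained models; simple keyword matching is used here."""
    text = user_message.lower()
    if any(word in text for word in ("anxious", "worried", "nervous")):
        return "anxiety"
    if any(word in text for word in ("sad", "down", "hopeless")):
        return "sadness"
    return "neutral"

def respond(user_message: str) -> str:
    """Select a pre-written, clinician-authored response for the detected category."""
    emotion = classify_emotion(user_message)
    return RESPONSE_LIBRARY[emotion][0]

if __name__ == "__main__":
    print(respond("I've been feeling really anxious about my exams."))
```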
In their observational study, Inkster and colleagues (2018) examined the effectiveness of Wysa for individuals with self-reported depressive symptoms. High-engagement users reported significantly greater improvements in their depressive symptoms than low-engagement users after using Wysa daily for two weeks. It should be acknowledged, however, that the study lacked both randomization and a control group unexposed to the intervention. Additionally, the samples of low-engagement and high-engagement users were relatively small. It is also possible that low engagement resulted from low effectiveness (which may have caused a lack of motivation), rather than the other way around.
Furthermore, in their analysis of user feedback on the Google Play Store, Malik and colleagues (2021) found that the app was overwhelmingly positively reviewed. The app was deemed acceptable, usable, and useful, with reviews commending its engaging exercises, conversational ability, non-judgmental tone, ease of conversation and access, and effectiveness in improving mental health.
Others have observed that Wysa's responses lack fluidity, which can diminish the quality of the conversation. In one study, Legaspi Jr. and colleagues (2022) evaluated the extent to which Wysa can influence the well-being of students. Their evaluation examined six user-rated criteria: content of responses, affect qualities, human-likeness, openness, helpfulness, and user satisfaction. Among these criteria, human-likeness received the lowest average rating by a large margin. Wysa's rigid conversation flow appears to pose a challenge, occasionally leaving users feeling neglected when the chatbot fails to properly acknowledge their input.
On the other hand, in their analysis of user conversations with Wysa, Beatty and colleagues (2022) discovered elements of bonding, such as gratitude, positive impact, and attributions of human-like qualities to the chatbot, with users describing it as caring, helpful, and non-judgmental. The therapeutic alliance, defined as a therapist-client partnership built on trust and collaboration toward the client's goals, is one of the most powerful mechanisms of change in psychotherapy. According to these researchers, therapeutic alliance scores improved over time and were even comparable to ratings from studies of alliance in human-delivered, face-to-face psychotherapy.
Chatbots like Wysa hold potential to mitigate the significant challenges associated with traditional face-to-face mental health care, including issues of accessibility, stigma, and cost. Nevertheless, these resources should serve only as supplementary or intermediate support rather than a complete substitute for mental health professionals, especially for individuals grappling with severe mental health issues. In future work, more rigorous studies should be conducted to yield comprehensive insights into the efficacy of AI-powered conversational agents such as Wysa.
The content of this article was last updated on December 3, 2023