By the end of this blog post, your greatest fears or hopes may be realized.
Woebot Health, a technology company founded in 2017 by veteran clinical psychologist Dr. Alison Darcy, offers users the on-demand mental health support of a chatbot, named Woebot. Care is delivered in the form of a text-based conversation between Woebot and a human on an iOS or Android mobile app. Woebot’s communication with users is rooted in the tenets of Cognitive Behavioral Therapy, Interpersonal Psychotherapy, and Dialectical Behavior Therapy. The conversations are generated using natural language processing (NLP). Woebot reports that 1.5 million people have downloaded it as of September 2023.4
Screenshots and video below reflect real conversations between Woebot and its users. Users can chat with Woebot 24/7 and expect instant responses. Woebot can be trained to “nag” users into discussing how they feel at different points during the day, and users can select the theme of support they’d like (sleep hygiene, anxiety, etc.). The app also features a graph for users to track their daily moods. Woebot’s content targets three populations: adults, adolescents, and new mothers.
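Woebot hasn’t published its data model, but the record behind a daily mood graph could be as simple as one score per day. Here is a minimal sketch; the 1–5 scale and the names are assumptions, not Woebot’s actual schema.

```python
from datetime import date

# A mood graph needs little more than one score per day. The 1-5 scale and
# the names here are assumptions, not Woebot's actual schema.
mood_log: dict[date, int] = {}

def record_mood(day: date, score: int) -> None:
    """Store a single daily mood score (1 = very low, 5 = great)."""
    if not 1 <= score <= 5:
        raise ValueError("score must be between 1 and 5")
    mood_log[day] = score

record_mood(date(2023, 9, 1), 2)
record_mood(date(2023, 9, 2), 4)

# The trend the app charts is just these points ordered by date.
for day in sorted(mood_log):
    print(day, "*" * mood_log[day])
```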
Woebot’s use of AI is limited, but evolving and expanding rapidly. Currently, every message Woebot sends originates from a bank of statements and questions pre-approved by conversational designers and mental health experts; AI’s role is scoped to learning users’ patterns, problems, and habits of speech. But Woebot has big plans to change that. In July 2023, Woebot announced that it would begin research on the safety and effectiveness of integrating Large Language Models (LLMs) into its technology.4,5 The deployment of LLMs (a type of generative AI) might mean that Woebot will start to craft its own correspondence. (Think of this foray like ChatGPT, but for managing your deepest fears and worst nightmares.) The upside is high, the downside is catastrophic, and the shift is probably inevitable.
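Woebot hasn’t published its implementation, but a retrieval-based design like the one it describes might look something like this minimal sketch. The response bank, intent labels, and keyword matching are all hypothetical stand-ins; a real system would use a trained NLP model rather than keyword lookup.

```python
import random

# Hypothetical bank of clinician-approved lines, keyed by detected intent.
RESPONSE_BANK = {
    "anxiety": [
        "That sounds stressful. What thought keeps looping in your mind?",
        "Let's slow down. Can you name the thing you're most worried about?",
    ],
    "low_mood": [
        "I'm sorry you're feeling down. Want to try a quick thought exercise?",
    ],
    "check_in": [
        "Hi! How are you feeling right now?",
    ],
}

# Toy stand-in for the NLP layer; the real system would use a trained model.
INTENT_KEYWORDS = {
    "anxiety": {"anxious", "worried", "panic", "nervous"},
    "low_mood": {"sad", "down", "hopeless", "numb"},
}

def classify_intent(message: str) -> str:
    """Map a user message to an intent label."""
    words = set(message.lower().split())
    for intent, keywords in INTENT_KEYWORDS.items():
        if words & keywords:
            return intent
    return "check_in"  # default when nothing matches

def respond(message: str) -> str:
    """Every reply is drawn from the pre-approved bank, never generated."""
    return random.choice(RESPONSE_BANK[classify_intent(message)])

print(respond("I'm so anxious about tomorrow"))
```

Swapping in an LLM would mean replacing that draw from a fixed, vetted bank with freshly generated text, which is precisely why Woebot is researching safety first.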
Woebot’s success hinges on two premises: (1) contemporary demand for high-quality mental health care dwarfs the supply of human clinicians, and (2) the depth of relationship Woebot can form with humans is on par with the kind of relationship humans would form with a therapist. The first point is industry-agnostic: one of the most common value propositions for software is that it scales the delivery of services. Analysis of the second point can lead one to think Woebot is a nightmare dressed like a daydream.

Research conducted by Woebot in 2021 concluded, from a sample of 36,070 users over a two-week period, that users felt empathetically connected to Woebot.6 (Supposedly, forming this connection took between three and five days for most users.) Most users interacted with Woebot every day, and their depressive symptoms diminished, per self-report. This study is corroborated by thousands of positive reviews on the App Store and Play Store, where Woebot averages 4.7 and 4.1 stars (out of 5), respectively, and people describe it as “funny” and full of “much insight.”7 Often praised is Woebot’s mastery of the reframing technique common in CBT. (An example would be helping a user characterize a particular endeavor, like earning a poor grade, as a failure instead of calling him or herself an abject failure.) Where Woebot falls short is in its ability to manage people through acute crises, like suicidal ideation. (Its NLP can detect alarming language, but it isn’t trained to be a crisis intervener.) Setting aside the questionable fidelity of self-reported, company-sponsored findings, that people feel more connected to and comforted by a two-dimensional robot than by the three-dimensional human beings around them is profoundly sad.
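Woebot hasn’t disclosed how that detection works, but a first-pass flag for alarming language that escalates to human resources rather than improvising might look like this sketch; the phrase list and handoff message are purely illustrative.

```python
# Hypothetical sketch of flagging alarming language: the phrase list and the
# handoff message are illustrative, not Woebot's actual screening logic.
CRISIS_PHRASES = {"suicide", "kill myself", "end it all", "hurt myself"}

def is_crisis(message: str) -> bool:
    text = message.lower()
    return any(phrase in text for phrase in CRISIS_PHRASES)

def respond_from_bank(message: str) -> str:
    return "Tell me more about that."  # placeholder for the normal scripted flow

def route(message: str) -> str:
    if is_crisis(message):
        # Escalate to human help instead of letting the bot improvise.
        return ("It sounds like you may be in serious distress. "
                "Please reach out to a crisis line, like 988 in the US.")
    return respond_from_bank(message)

print(route("I want to end it all"))
```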
Below are a couple of examples of Woebot’s descriptions of how its adult and maternal modes build trust:
The good news is that sadness is Woebot’s gas and electric. (In case that joke didn’t land, that is a play on bread and butter.) Depression and anxiety don’t appear to be diminishing in frequency anytime soon, and if Woebot’s LLM experiments go well, it could start to replace human therapists. Obviating the need for human therapists will be a big challenge, though: to scale the delivery of high-quality healthcare, Woebot will need to hire many highly educated specialists in psychology, medicine, and data science. These experts are in chronic short supply, so shrinks can expect their couches to be kept warm for a while.
To use Woebot, a person must have a smartphone and be either a participant in a research study Woebot is conducting or a member of one of Woebot’s partner organizations, which are mostly healthcare delivery or employee benefits companies, including Virtua Health, Curai Health, and PayrollPlans. This is a departure from the company’s previous business model, which permitted unaffiliated individuals to try the app for free before opting into a paid subscription. (Yours truly considered accepting a job as a software engineer with Woebot when the app was more accessible, two or three years ago.) It is unclear why leadership changed course.
This analysis would not be complete without a review of the incipient government regulation. The intersection of healthcare and AI, like the intersection of AI and every other field, is nascent and dynamic. The companies that build the runway, though, are likely to be the ones that shape how they are policed, and Woebot is no exception to this rule. Woebot boasts that much of its regulatory compliance team formerly worked at the Food and Drug Administration.8 (The infamous Sackler family used the same playbook at Purdue Pharma to keep lethal opioids on the market for as long as possible. Woebot isn’t selling heroin, but the strategy might help it exclude competitors for a while. The proliferation of chatbots across all verticals means that without regulation, Woebot might not be able to maintain its moat.) The organization gets IRB approval for the studies it runs, and it promises that it does not share or sell data to advertisers.8 (Nothing is stopping it from doing so in the future, though.) It also remains to be codified whether sacrosanct doctor-patient confidentiality will apply between humans and Woebot. The executive branch of the US federal government is laying out comprehensive guidelines and watchdogs for how AI will be used from the government’s perspective; how that policy influences private companies’ decision-making will be a highly consequential nail-biter.
That’s all! Now woe away.