Like many others, I’m currently fascinated — and a little terrified — by rapid advancements in AI. While the tech seems like it could be used for good, several applications of it leave me worried. Websites have replaced human writers with error-prone robots, Hollywood refuses to protect its creative talent from the tech, and AI-generated games like Sumner have raised red flags about bot plagiarism. But what’s concerned me most in recent months is the existence of AI therapy.
There are currently a handful of services that automate therapy in some way. Woebot is an “automated conversational agent” positioned as a personal mental health tool. Users can check in with it every day for brief conversations with a chatbot that sends over wellness tips and videos. Wysa, on the other hand, pairs users with both a human mental health professional and an AI coach that helps them work through their emotions. Considering how much traditional therapy relies on connection with a real person, the idea of automating it seems like a recipe for disaster.
There’s a reason all of this especially intrigues me: a little visual novel called Eliza. Released in 2019, the indie gem quietly predicted AI’s troubling move into the mental health space. It’s an excellent cautionary tale about the complexities of automating human connection — one that tech entrepreneurs could learn a lot from.
Set in Seattle, Eliza follows a character named Evelyn Ishino-Aubrey who begins working at a new tech venture created by a fictional, Apple-like megacorporation called Skandha. The company has created a virtual counseling app, called Eliza, that offers AI-guided therapy sessions to users at a relatively affordable price.
Eliza isn’t just a faceless chatbot, though. In order to retain the human element of face-to-face therapy, the app employs human proxies who sit with clients in person and read generated responses from the bot in real time. Skandha claims it has its methodology down to a science, so proxies are forbidden from deviating from the script in any way. They’re simply there to add a tangible face to the advice the machine spits out.
The game resists the urge to present that idea as an over-the-top dystopian concept. Instead, it opts for a tone grounded in realism, not unlike that of Spike Jonze’s Her. That allows it to ask some serious and nuanced questions about automating human interactions that were ahead of their time. The five-hour story asks whether an AI application like that is a net benefit, making something as expensive as therapy more approachable, or simply an exploitative business decision by big tech that trades away human interaction for easy profits.
Players explore those questions through Eliza’s visual novel systems. Interaction is minimal here, with players simply choosing dialogue options for Evelyn. Those choices have a major impact on her sessions, though. Throughout the story, Evelyn meets with a handful of recurring clients subscribed to the service. Some are simply there to monologue about the low-stakes drama in their lives, but others come to the service with more serious problems. No matter the severity of an individual’s situation, Eliza spits out the same flat script for Evelyn to read, asking the same stock questions session after session and prescribing breathing exercises and medication.
The more Evelyn gets invested in the lives of her clients, the more she begins to see the limits of the tech. Some of Eliza’s go-to advice isn’t a one-size-fits-all solution to every problem, and more troubled clients begin pleading for real help from an actual human. Players are given the choice to go off script and let Evelyn take matters into her own hands, a move that has some serious implications for both her job and the well-being of her clients.
Going rogue isn’t always the right answer. While some of her advice gives clients the help they need, others find themselves spiraling even more. Her words can get twisted around in ways she didn’t expect, something that Eliza’s safe algorithm is built to protect against. Is it safer to stick to the sterilized script or at least try to make a real connection? And does tech like this ultimately hurt more than it helps, or vice versa?
Eliza doesn’t answer those questions, leaving them for players to chew on. It’s a thoughtful interrogation of modern tech that’s only become more pressing given the emergence of services like Wysa, which are dangerously close to the game’s fictional tech. Whether you’re a supporter of AI tools like ChatGPT or firmly against them, Eliza provides a compelling cautionary tale about the limits of both machines and humans.
Eliza is available on PC and Nintendo Switch.