Journal #7: Critique

The visual novel “Eliza”, a direct reference to ELIZA, the original AI “therapist” program, is a wonderful, depressing little game that examines and critiques the intersections of technology, AI, and mental health. The story follows Evelyn, one of the lead programmers of the Eliza project, who left after a workplace fatality, only to return a decade later as a “proxy”: an individual who works as a therapist but says only what the AI tells them to say. The game covers several major issues, including the role of technology in the forced assimilation of cultural narratives and goal-setting, as well as the gamification of mental health and the workplace, and it holds a magnifying glass to a peculiar question: what is human interaction in the face of technology? When a client comes in for a session, they are in fact talking to a human and hearing words spoken by a human. But are they not really talking to the AI, given a human face? Many of the clients seem to feel better only temporarily, returning later with exactly the same problems, none of them resolved. The solutions the AI gives are generic (exercise, listen to soft music, take your medicine, and so on), while root causes and specificity are largely ignored.

What I particularly like about this game is that while there are many messages the player can derive from the experience, none of them are ham-fisted or even “preferred” by the narrative. Yes, it looks like an AI might take over the field of therapy, but the story offers evidence of both its benefits and its drawbacks and, through the player’s actions, allows the player to decide how this particular narrative plays out. In other words, the player actively participates in the politics of the experience. Still, in my own playthrough I felt that the AI could perhaps help some people with surface-level issues, but that because it truly lacks understanding, the deeper issues were always left unsolved. Or, as Bogost puts it: “The computer program has no real understanding of the meaning of the user’s input; rather, it is taking that input and spinning it into a conversation. Eliza is a machine for generating conversations according to procedural rules” (Bogost).
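Bogost’s point about “procedural rules” can be made concrete with a toy sketch in the spirit of the original ELIZA program. This is an illustrative simplification, not the actual rules Weizenbaum used: it matches a few keyword patterns and reflects the user’s own words back, with no model of meaning at all.

```python
import re

# First-person words swapped for second-person ones before echoing back.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# Each rule pairs a keyword pattern with a canned response template.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Swap pronouns so the echo sounds like a reply, not a repeat."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(text: str) -> str:
    """Return the first matching rule's template, filled with reflected text."""
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please go on."  # generic fallback when nothing matches
```

For example, `respond("I feel lost at work")` yields “Why do you feel lost at work?” The program never understands what “lost” means; it only spins the input into a conversation, which is exactly why the game’s clients keep returning with the same unresolved problems.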

Full Game No Commentary:

Works Cited:

Bogost, I. (2006). Playing politics: Videogames for politics, activism, and advocacy. First Monday. Retrieved March 3, 2021, from