Where tech aligns

Life After Westworld

The acclaimed HBO series was canceled right when we need it most.

Dystopia is a crowded field in today’s television. As early twenty-first century reality catches up to the cyberpunk of the 1980s, streaming and cable television have taken to bridging the gap between the Way We Live Now (Mr. Robot) and the world inevitably to follow (The Handmaid’s Tale, Upload, Black Mirror, Y: The Last Man, among many others). HBO’s Westworld stood apart even from that critically lauded selection, always seeming to aim higher and with greater intelligence. The storyline that initially targeted the thorny ethical questions arising from AI’s theoretical sentience came to incorporate broader, more ambitious issues: social equity, the quest for immortality, the extinction of humanity, and the ability to control human fate. 

The series had been planned to round out at five seasons. So it came as a shock and a disappointment to find it had been canceled after the end of the fourth. Even though the fourth season presents a reasonably satisfying coda to the show’s arc, creators Jonathan Nolan and Lisa Joy – working loosely off the premise originally developed in Michael Crichton’s 1973 film – had intimated that a fifth season was forthcoming, making the decision to cancel all the more unforeseen.

The cancellation is also out of step with HBO’s habit of renewing shows of far more dubious popularity, and of far less social relevance. Why cancel a show that raises fundamental questions about the dual pillars of our reliance on AI and on the benevolence of the engineers who created it? With our collective attention relentlessly siphoned by the smartphones in our palms, we are well advised to interrogate the cost at which we pursue the gratification of our pleasures, both the mundane and the more “boutique.” 

And the pursuit of these pleasures is indeed the subject that Westworld – in its early seasons – explores. The wild-west theme park offers its clientele equal parts “horror, romance, titillation” as it stages loops for its robot hosts – virtually indistinguishable from humans – to entertain their guests. Prostitutes with tight bodices and ring curls satisfy the depravities of the world’s elite amid the backdrop of dark saloons and sunset canyons. It is as though Sergio Leone had gotten hold of the screenplay for Eyes Wide Shut and suffused it with a major dose of Philip K. Dick. And the series pursues questions that Dick, over the course of his literary career, doggedly forced his gradually expanding public to confront. 

The shock over the cancellation is not, perhaps, a reaction to the premature curtailing of a series that lacked closure or intrigue. Viewers who stuck around through the fourth season were treated to an elaborate sequence of narrative maneuvering and plot twists – so elaborate, in fact, that the convoluted storyline is probably what caused the show to hemorrhage popularity in the first place. Ratings plummeted, the initial hype frittered away, and the network could no longer justify the massive expense of the production.  

It seems – paradoxically, but not uncommonly for sci-fi projects of scale – that the show’s ambition is what killed it. Over the course of four seasons, the writers play so many narrative sleights of hand that many viewers likely walk away from individual episodes unable to summarize the intricacies of the plot, much less reflect on the broader social and political commentary that the series attempts. 

In my view, the series erred in two key ways. First, it pursued and then abandoned any sort of straightforward allegorizing that would make it accessible to a lay audience. Take, for example, the allegory of the maze. According to Robert Ford, the park’s creator, consciousness isn’t a journey upward but a journey inward. The path of AI from host to human is best represented not by a pyramid but by a maze. In season one, Ford diagrams the ways in which memory and improvisation combine in the hosts to approximate a consciousness that can only be achieved by entering the maze and reaching its center. It was the most compelling and unifying element of the first season, and it ended up failing viewers because the allegory of the maze – along with the park itself – is abandoned in favor of the futuristic cities of the later seasons. 

The second way the series erred was in continually undervaluing the significance of death as a narrative device. Beginning in the first season and continuing throughout the remainder of the episodes, the question of which characters are alive and which are dead – as well as the question of what death means for AI and human alike – is as muddled as the question of who is host or human. We watch the same characters die, become resurrected, and then die again, until we lose any sort of meaningful relation to them as autonomous agents that might be subject to the laws of mortality. Characters weave in and out of the plot until, by the fourth season, viewers have too confused a picture of Dolores-as-Christina’s sentience to feel the appropriate rage at her mistreatment or catharsis at her triumph.

This confusion dovetails with a broader sense of consumer ennui, as viewers are weary from grappling with the complexities of the AI-human relationship, and from wondering how to apply lessons from fiction to their workaday relations with technology. Pondering the nature of consciousness is the mise en abyme that threatens to send the most circumspect among us into spirals of existential confusion. The sheer volume of cautionary tales about the pitfalls of creating consciousness over the past ten years has inured us to the problem’s immediacy. 

Confusion and ennui are compounded by the Cassandra Effect in sci-fi writing: those with the prophetic sense to foretell the horrors of the future are fated to tell them with startling accuracy, and never to be believed. But worse than the issue of disbelief, the status of “entertainment” to which we relegate our culture’s most promising cautionary tales obliges their creators to dilute them: to make them palatable to a viewership short on attention and big on visual stimulation. 

In the pilot episode, a quote from Shakespeare’s Romeo and Juliet is repeated like a mantra by various characters revealed to be robots: “These violent delights have violent ends.” So says Friar Laurence, foreshadowing the young lovers’ deaths and the destruction of their warring families. If we extend this adage to the moral of Westworld, then we are obliged to concede that sex and death are twin sides of the same coin. There is no act of violence, aggression, or sexual depravity that one commits in a vacuum. AI is not the sponge that will obligingly sop up the worst of our hedonism and send its file to the trash icon, returning tabula rasa to a purified state. Even machines have memory. Even AI holds a grudge.

This broaches the question that the series endlessly raises: what will we do when the tables have turned, when AI backs us into the corner where we have relentlessly relegated it as something non-human and unworthy of ethical consideration? What will we do when algorithms learn and apply the elements of human nature that debase us: our rancor, our impulse to approach justice retributively, our lust for dominance? 

If there is a negative reaction to the cancellation of Westworld, it is because its writers spent four seasons failing in their promise to deliver what viewers ultimately need: a cathartic enough ending that might flatter our sense that human ingenuity in storytelling has the prophetic sense to deliver us from the worst of all possible dystopias. If we can’t stop AI from gaining sentience, if we cannot wrest the power from feckless engineers to slow the rapid progress of machine learning, at least we can engage the questions it raises with grace and foresight. Right?

As we approach an age of hyper-reliance on artificial intelligence, cautionary tales must be decipherable and forthright, or else they risk becoming perfunctory: we must apprehend their morals intuitively, or else we risk digesting them passively, as though they were a party slogan to be repeated ad nauseam but never truly integrated. 

With the cancellation of Westworld, we have lost an opportunity to meaningfully reflect on what it means to be a human, what it means to be a robot, and what it means for the two categories to partially overlap. The singularity haunts us, but neither fiction nor art can prepare us for the worst of its implications. Our visionaries can only offer us tepid entertainment – circuitous storylines that lull us into the illusion that we have watched something profound – as we sink ever further into loops of distraction and the pursuit of ambivalent pleasures.