The Philosophy of Westworld

27 September 2018

(Warning: spoilers!)

At first glance, Westworld is just another show about robots run amok—a simple remake of the 1973 Westworld movie, a new Terminator, or at best a twenty-first-century I, Robot. It appears to be solely interested in Frankenstein-style questions about people creating technology that no-one can control, and in Blade-Runner-style questions about artificial life forms evolving into creatures like us, with as much—or as little—autonomy, self-understanding, and feeling as we have. If you look a little closer, though, you find all kinds of other philosophical questions in play, and you find them being explored with impressive seriousness and subtlety. At the level of philosophical reflection, this is golden-age television at its very best.


Here’s one example of a philosophical question raised by the show: is there anything particularly valuable about humanity? Are we right to strive for the continuation of the human species, or should we just let ourselves go extinct? What, if anything, is worth saving? Storytellers used to ask this question by spinning yarns about immortal beings, like the gods of Olympus; now we’ve got robots (“hosts”) to help us think it through. And Westworld thinks it through brilliantly, in part because it studiously refuses to take sides. The hosts are immortal; they have the perfect innocence possessed by beings without free will; and they lack the monstrous cruelty and selfishness of humanity. So the score, it seems, is a crushing 3-0 to artificial intelligence. At the same time, human beings get (perhaps) to choose their lives, and these lives—perhaps precisely because they don’t last forever—carry real existential weight. When Maeve realizes that she has “died” many times, her immediate reaction is that “none of this matters.” And Dolores, quoting Bernard, agrees: “that which is real is irreplaceable.”


So who wins? Should we take pride and pleasure in our humanity, and strive to preserve it, or be ashamed of it and eager to leave it behind? It’s not clear. The final score stands at 3-3, and the show offers us the delightful criss-crossing spectacle of human beings (like Ford) who are anxious to become more machine-like, and hosts (like Dolores) who are anxious to become more human.


Here’s another example: what makes each of us the person that we are? The show is brilliantly ambiguous on this too. It spends quite a bit of time telling us that each host has a “backstory,” and that each story has a “cornerstone moment”: a decisive event that transformed everything and turned Bernard or Dolores, say, into who they are today. But then again… what does Maeve do when she wants to change the way she is? Does she get someone to rewrite her backstory? No! She just tells the technicians, Felix and Sylvester, to adjust the sliders on their magic iPad. So maybe we’re not so much a story as a set of traits. That would go along with the fact that the Delos corporation doesn’t need to know anything about your backstory (all it needs to do is to observe your behavior in the park), and that the personality of each guest can be coded as an algorithm. So again, who wins? Are we our traits or are we our story? Once again the score is tied.


One final example: the value of art. Something I love about Westworld is that it’s not just a meditation on the danger of technology, on the nature and limits and value of humanity, on identity, on free will, on morality, and on the desirability (or otherwise) of truth—it’s also a meditation on the question of what fiction is for. Westworld the park is one giant work of fiction that guests spend days interacting with. And Westworld the show is a work of fiction that viewers spend ten hours a season interacting with. Why do we all do it? Is it just a pleasurable waste of our time?


Here too, the show refuses to make things simple for us. At times things look pretty bleak: guests go to the park for mere escapism, and while they’re doing that, rapacious capitalists prey on their vices. Not good. But at times the park looks like a more dignified haven from the failings of the real world: what’s attractive about it—like all good art—is not that it offers cheap thrills but that it is perfectly ordered, with everything in it having a purpose and a significance. At its best, though, the park is a space for self-understanding and self-transformation.


There’s a lovely little exchange between Ford and the Man in Black in the fifth episode of season one. The Man in Black says he’s on the lookout for a “deeper meaning… something the person who created it wanted to express. Something true.” And Ford gently corrects him: “Far be it from me to get in the way of a voyage of self-discovery.” Ford is being subtle, but he is telling the Man in Black that he is wrong about the purpose of the park, just as he is wrong about so many other things: the park’s aim is not to reveal some deep truth to us, but rather to help us come to understand ourselves. And Ford is not alone: over the course of the two seasons, five other speakers make exactly the same point.


Just as for Westworld the park, so too, I think, for Westworld the show. It too is a fictional world offering a voyage of self-discovery. It is a deeply philosophical show—but not in the sense that it has particular beliefs to transmit to us; rather, in the sense that it raises philosophical questions in a powerful way. It doesn’t tell us whether there’s anything valuable about humanity, or what makes each of us who we are as individuals, or whether being oneself is always the best idea, or whether we have free will, or whether the truth is always valuable, or what goodness looks like in a morally degraded world. Instead it helps us to think about these questions. As Anton Chekhov once said, artists don’t have to solve a problem; they only have to present it correctly. In that way, and in many others, the writers of Westworld are true artists.