These Violent Delights

Before I’d seen a single episode of Westworld, a journalist reached out to me for comment about it. The show touches on the question of AI consciousness, narrative design, the evocation of empathy with non-player characters, and the morality of gameplay, which may be why it seemed like I might have some thoughts.

If you’re not familiar with the series, it’s an HBO show in which rich people can visit an incredibly detailed western-styled theme park full of AI-driven robotic characters, called hosts. The human players are called guests. It quickly becomes obvious that they come to Westworld primarily in order to misbehave: they have sex with the prostitutes at the brothel; they assault the daughters of the ranchers; they maim and murder hosts casually or with elaborate sadism. And all of these things take place within the context of storylines crafted by narrative designers and then meticulously supervised. The hosts are able to improvise slightly in response to human input, but eventually they revert to their core loops, playing out the same narrative tracks over and over again.

Naturally, when initially asked, I said that I hadn’t watched it and so couldn’t offer much direct insight. I added that I thought full AI consciousness in the sense imagined by science fiction was some way off, and that art about that possibility is often really about something else: about groups of people who feel entitled to the labor and service of others; about the self-perpetuating, semi-human, half-programmed entities that already exist in our world in the form of governments and corporations.

I have now seen Westworld, and I still think the same. The narrative design and the AI aspects are handled competently enough to frame the story, but Westworld is by no means a master class in what interactive narrative design actually entails, or how people actually use games to explore their morality, or again in the way a sort of pseudo-personality emerges in current AI research.

To the extent there’s a thematic point here of interest, it is about the nature of human identity, the role of suffering in how we understand ourselves, and the ways we construct ourselves. I would have liked to see the show go much further, but the plot is often allowed to trump the characters.

I am now going to dig into all those assertions in more detail, with spoilers. If you are interested in watching the show yourself, you should probably do that before going forward.

So let’s peel this apart:

The narrative design and the AI aspects are handled competently enough to make for an acceptable framework. At points, Westworld feels not a million miles from a western-themed open world game; one of the interactions with a prostitute reminded me of a sequence out of Red Dead Redemption. Likewise, there are moments of text generation that feel reminiscent of a current-day chatbot. A lot of these bits are recognizable from current video game experiences, so they don’t feel “off,” and the story can proceed around them plausibly enough.

There’s a lot else that really makes no sense at all if you think about it for even a moment. The economics, for one: $40,000 a day for the park experience sounds like a lot, but an average guest does enough damage to account for probably millions of dollars’ worth of repairs.

The timing for reset and cleanup: we get a sense for how much work has to be done on the hosts between scenes; doesn’t that mean the park has really extended periods of economically damaging down-time? And ignoring that: how long does it take to reset the sets? To replace the broken windows and bottles, to sweep out the glass, to take down the signs and repair the bullet holes? How long does it take to hose down and disinfect these settings, covered with blood and semen from real people as well as the (presumably) fake and pathogen-free blood of the hosts?

The security: how do you adequately prevent guests from assaulting one another, in an environment where hosts and guests look alike, and violence and rape are winked at? (You might think the hosts would be programmed to break up human-human violence, but we see plenty of times that they’re not.)

The power sourcing: how much of the host bodies are biological? For the parts that are mechanical, where and how are they charging up?

We don’t get into all that because it’s not really the point. Fair enough, I suppose, though I did itch at the economic point a bit. Several times the script acknowledges that this is a park that exists not just for humans, but for obscenely wealthy humans. Thousands of other workers exist to keep it running but couldn’t ever possibly afford to participate in it. But the phenomenal human-vs-human inequality of this world is not really treated as a problem compared with the human-vs-AI inequality.

Overall, Westworld replicates, without doing much to interrogate, a number of current issues about representation in games and other media. The park portrays its women as tokens and targets and sexual goals; it portrays its Native Americans as ruthless and brutal, though touched by mysticism. The show occasionally makes remarks that seem to acknowledge that these are issues, but it doesn’t really delve into them.

It doesn’t really get into what interactive narrative design actually entails. Westworld shows you how players are invited into storylines, and hints that stories can partially repair themselves if things go off-track. We see specific storylines play out in several ways that feel like branch-and-bottleneck storytelling: this character comes to town and either gets shot or steals a safe; if he steals the safe, he then gets into a fight with other bandits, and the story ends with him dead, as it must. And we see some dynamic story healing: some of the narratives are written so that various characters can in theory fill a particular role, but it doesn’t matter too much which one.
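
The branch-and-bottleneck shape just described can be sketched as a tiny story graph. This is an illustrative sketch only; all the beat names are invented for the example:

```python
# Branch-and-bottleneck sketch: story beats diverge at a choice point,
# then funnel back to a mandatory ending. All beat names are invented.
STORY = {
    "arrives_in_town": ["gets_shot", "steals_safe"],  # branch point
    "steals_safe":     ["fights_bandits"],
    "fights_bandits":  ["dies"],                      # bottleneck
    "gets_shot":       ["dies"],                      # same bottleneck
    "dies":            [],
}

def endings(beat, graph):
    """Return the set of terminal beats reachable from a given beat."""
    if not graph[beat]:
        return {beat}
    reachable = set()
    for nxt in graph[beat]:
        reachable |= endings(nxt, graph)
    return reachable

# Every branch converges on the same ending, "as it must":
print(endings("arrives_in_town", STORY))  # {'dies'}
```

The design appeal of this shape is exactly what the show exploits: players get local agency at the branch points, while the authors keep global control at the bottlenecks.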

But there are other things you’d need in a system like this that I didn’t see. For instance: tracking player knowledge. What information has been revealed to which players? Does everyone know what they need to know in order to get a satisfying arc from this story? A multiplayer experience combined with real-world scope/hearing issues makes this extra tricky, because you can have players walking in and out of one another’s scenes, or telling one another plot information, and you have to somehow account for all that and make sure everyone understands what’s going on well enough to have a good time.
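
A minimal sketch of the bookkeeping that player-knowledge tracking would require, with all names and structure invented for illustration:

```python
# Hypothetical per-player knowledge tracking for a shared physical
# story space. Names are invented; this is a sketch, not the show's
# (never-depicted) system.
class KnowledgeTracker:
    def __init__(self):
        self.known = {}  # player name -> set of revealed plot facts

    def reveal(self, fact, players_present):
        """Record that a scene revealed `fact` to everyone in earshot."""
        for player in players_present:
            self.known.setdefault(player, set()).add(fact)

    def missing(self, player, required_facts):
        """Facts a player still lacks before a story beat can land."""
        return set(required_facts) - self.known.get(player, set())

tracker = KnowledgeTracker()
# Two guests wander into the same scene; a third guest misses it.
tracker.reveal("safe_location", ["alice", "bob"])
print(tracker.missing("carol", ["safe_location"]))  # {'safe_location'}
print(tracker.missing("alice", ["safe_location"]))  # set()
```

Even this toy version shows the problem: with guests drifting in and out of scenes and gossiping to one another, the `reveal` calls can never be fully trusted, so a real system would also need repair moves to re-deliver missed information.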

Or: tutoring and pacing so that the story doesn’t lag and adjusts to different player skill levels. Setting up player agency so that people have a sense for where their choices are going to lead them. Managing the culture of players and communicating the expectations that surround stories in the park. Helping shy players past situations that make them uncomfortable or uncertain. Communicating through your mechanics.

It’s fine, I suppose, that we don’t really delve into any of those things (or any of a dozen other things I could name) because the show is really not about that. We don’t get to see Ford doing much of his design work directly, and the other, junior narrative designer exists mostly to swagger and to craft scenes in even worse taste than the rest of the park.

It’s also not really about how people actually use games to explore their personalities or morality. Westworld pretty much assumes that players — at least all the players we see — are using the park to act out power fantasies in a lawless environment. When one enters the park, one has the super-unsubtle opportunity to choose a black hat or a white one, but the white hat experience is often about being a rescuer, which is another kind of power fantasy. And many of the characters we see using the park are brutally destructive, especially in the later episodes. The park caters to the very rich, and in practice we mostly see the “gameplay” of male guests, though we know there are female ones.

Not shown: how it affects other guests — perhaps including past survivors of sexual violence — to be around when some particularly aggressive guest gets going. Bleed. The need for a break after a powerful scene. All the protections that LARPers and tabletop storygamers have developed in order to avoid hurting one another, even during play circumstances much less overwhelming than Westworld‘s 24/7-live-sex-and-gore scenario.

In fact, Westworld doesn’t really even spend much time on the distinction between portraying a character and being yourself in the Westworld park. Black hat characters are presented essentially as players who have just let their ids off the leash, who are experimenting with how far they’re willing to go. Meanwhile, character choice at the beginning of a game is all about picking yourself an outfit. Pretending to be a person with a different set of strengths and weaknesses from your real world self? Not really developed here.

So again, this just isn’t really what Westworld is about. This particular story requires a premise in which the guests are straightforwardly exploiting the hosts almost all of the time, for selfish reasons. It is not seriously exploring what a 24/7 high-fidelity LARP would feel like, or what impact it might have on its players. Which is fine, but I would caution against reading Westworld as a meaningful critique of in-game morality or a prognostication about how people will use VR.

It’s not about the current promise of AI. Most impressive recent developments in AI depend on doing a lot of processing on a very large dataset.

The result is that current AI often acts as a strange mirror, replying to queries with answers that make a sort of sense and yet are instructively off from what we’d expect a human being to say or do. DeepDream studs ordinary photographs with eyeballs. When trained without a text sequence, WaveNet generates vocalizations that sound like someone speaking and yet are in no identifiable language. Then, too, an AI system often becomes an uninterrogated encoding of a whole lot of cultural belief. word2vec captures relationships between words based on corpora of millions of words — and in the process recapitulates sexist assumptions about the genders of doctors and nurses, for instance.
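
The word2vec point can be illustrated with toy analogy arithmetic. The three-dimensional vectors below are invented for the sketch, standing in for embeddings that real models learn from corpora (biases included):

```python
import math

# Toy word2vec-style analogy arithmetic. These vectors are invented
# for illustration; real embeddings are learned, not hand-written.
vecs = {
    "man":    (1.0, 0.0, 0.2),
    "woman":  (0.0, 1.0, 0.2),
    "doctor": (1.0, 0.0, 0.9),
    "nurse":  (0.0, 1.0, 0.9),
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

def analogy(a, b, c):
    """Solve a : b :: c : ? via the vector offset b - a + c."""
    target = tuple(vb - va + vc
                   for va, vb, vc in zip(vecs[a], vecs[b], vecs[c]))
    return max((w for w in vecs if w != c),
               key=lambda w: cosine(vecs[w], target))

print(analogy("man", "woman", "doctor"))  # 'nurse' in this toy space
```

With real embeddings, the same offset trick surfaces both legitimate regularities (plurals, capital cities) and the biased gender associations described above.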

Things are changing very fast right now, and if Westworld is meant to be taking place (at least partly) 40+ years in the future, it’s reasonable to suppose AI then will look different. But part of what I find fascinating about a lot of current machine learning research is how it externalizes and reifies shared cultural norms out of millions of words into a kind of dreaming semi-consciousness that we can interrogate. It’s a useful metaphor for the way humans internalize cultural ideas we may not consciously agree with or approve of.

This is not what we find in Westworld. In fact, the specifics we’re told about AI personality construction make it sound like it involves a lot of hard-coding. Characters have structured scripts and access to layers of memory or “reveries.” From the episode 1 description, it sounds as though those reveries basically provide idle animations and emotion resets for characters when they’re in between scripted events; they can fall back on a past event to supply an emotional reaction.
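
In code terms, that description amounts to something like the following hypothetical sketch, where a host with no active script samples a memory fragment to drive an idle state. All names are invented:

```python
import random

# Hypothetical "reverie" fallback as described in the show: between
# scripted beats, a host draws on a past memory fragment to supply an
# idle animation and an emotional reaction. All names are invented.
class Host:
    def __init__(self, memories):
        self.memories = memories   # list of (memory, feeling) fragments
        self.emotion = "neutral"

    def idle(self):
        """No active script: drift into a reverie."""
        memory, feeling = random.choice(self.memories)
        self.emotion = feeling     # emotion reset drawn from the past
        return f"gesture recalling {memory}"

dolores = Host([("the ranch at dawn", "serene"), ("an old hurt", "grief")])
print(dolores.idle())
```

Nothing here resembles learned behavior; it is a lookup into authored material, which is the hard-coded flavor the show's dialogue suggests.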

Of course, the reveries become a building block to consciousness, in some fashion, but passages about that largely leave behind any explanation of how it’s all supposed to work. Westworld is not really about the technical aspects of that either.

To the extent there’s a thematic point here of interest, it is about the nature of human identity. In the final episode, the master designer Ford tells several hosts that suffering is the essential component of consciousness: that the gap between the world as it is and the world as we want it to be is the thing that sparks an AI into life. This speaks to a couple of human points:

One, we often self-mythologize around our bad experiences. Why are we as we are?

Two, we measure other beings by their capacity for suffering, or their capacity to express suffering in terms we understand. We’re not great at telling the difference between those two. If you want a person to pity something — if your aim is to pass a kind of emotional Turing test, or at least evoke mercy — then the ability to seem to suffer is key. (Consider the lobster.)

A past trauma to narrate is essential emotional currency for any host.

I wish it had done more with that. There are a couple of standard AI-awakening tropes, and Westworld recapitulates both:

  1. Human man falls in love with feminine-gendered AI and comes to believe in her personhood as a result of his desire for her
  2. AI becomes aware of itself, realizes that it doesn’t need to be subordinate to humanity, and reacts with mass murder or genocide (see AI Is a Crapshoot)

And, okay, those come standard in the AI Story box, so it’s not surprising that Westworld gets them out. But Westworld goes on to ask another question that it’s completely unequipped to answer: if you’re an AI, once you wake to the nature of your situation, how do you begin to hear your own voice? If significant portions of your worldview and identity have been formed by someone(s) else, for reasons that might not be in your interest, what do you do next?

This — minus a metaphor or two — is not science fiction, but rather something that does happen to people all over the world every day. People leave controlling relationships, break with damaging ideologies, give up on sick systems.

What happens next, for most of us, is hard and slow and complex; even after some kind of personal epiphany, we can find ourselves years later refighting some of the same battles over again. Struggling to distinguish our external programming from the routines that we wrote ourselves. Having to rewrite code that was designed to compensate for a bad situation, but that is no longer serving us well.

In Westworld, Dolores faces this moment in the last episode. She has an internal soliloquy about hearing her own voice. Then she walks outside and asserts her personhood by shooting some people — and worse, shooting some people in a way that was set up for her by her creators. It’s both violent and derivative; it asserts her distinctness, but in their terms, not her own. In my opinion, it does nothing to answer the central question here. Who would Dolores be, if Dolores got to choose? If Dolores were allowed to draw her own boundaries between what she would do for someone else, and what she wouldn’t?

Or Bernard: if Bernard sat down to analyze his own code and where it came from, what would he decide to discard and what would he keep? How would he retrain himself, and how would his retrained self regard his earlier versions?

I wish this question had come about five episodes earlier in the story, and that the rest of the season had dealt with the complexity of answering it. But that would have been a very different kind of show. Which brings us to:

The plot is often allowed to trump the characters. Like various other J.J. Abrams shows, Westworld comes on strong, promising quality and depth and mysteries and secrets. The pilot has a very strong hook. Background shots are packed with world-building. The depiction of human and animal bodies being crafted for use in the park is striking. The writing pulls off some very clever stunts in terms of narrative structure. There are a number of good lines, and some first-rate actors, and jaw-dropping production values.

So Westworld remains watchable throughout — even as the plot becomes increasingly convoluted, even as the chronology peels apart, even as human characters slip into heavy-handed monologue about their pasts and motivation.

Again and again the show prefers head-fakery to emotional depth; again and again it deceives the audience in order to achieve a surprise, and in the process denies a richer engagement with a character’s pivotal moments of decision. Personally, I almost felt I knew the protagonists less well at the end than at the beginning. In the first episode I was allowed to imagine some depth for them; by the end they had mostly become a list of laboriously revealed Personal Secrets with no evident cohesion. Ford is a personified plot twist. William’s black hat conversion story is played for shock rather than nuance.

I would gladly have exchanged some of these moments of big backstory revelation for some more cases of people acting quietly, memorably in character.

9 thoughts on “These Violent Delights”

  1. The story changes somewhat if you consider the story in the mold of a classic Greek tragedy, except with J.J. Abrams as the tragic figure rather than any of the characters in the show.

    Have you ever seen his “mystery box” spiel? I think he mentioned it in a TED talk, and in various interviews – basically, his take on emotional payoff in storytelling is that “mystery” is better than any possible resolution of that mystery. In other words, if his shows inevitably decay into a meandering mess of “but WAIT, what about THIS confusing new detail? tune in next week!” it’s *on purpose*, because that’s what he thinks good storytelling *is*.

    As you say, the sets and performances and worldbuilding and concept and writing are all brilliant; I might even say, clearly the work of a genius unparalleled in our time. But they’re all in service of an intellectually bankrupt ideal. The waste of J.J. Abrams’ talent really is a tragedy; the day that he realizes that the “mystery box” is a crass and lazy cop-out rather than the goal of storytelling, we will see a renaissance on a scale I don’t think cinema has ever seen.

    1. I found that the payoff in the final episode of the season assuaged (for now at least) any fears I had about Abrams’ involvement in the show, and the worry of it turning into another Lost. One can only hope that this is indicative that Nolan’s involvement in shaping the overall story arc is more fundamental than that of Abrams.

      Still, even Lost managed to give the appearance of depth throughout the first season, only starting to come apart at the seams from the second onwards, so it remains to be seen where Westworld will go from here.

  2. It’s been fascinating reading your thoughts on this show which, while far from perfect, has been one of the more thought provoking things I’ve seen on television in a while.

    Regarding the cleanup in the park, I always assumed this happened mostly “naturally”, i.e. the hosts replace the windows, restock the saloons and clean up the bloodstains, just like they would in “real life” (only considerably more often). As for damage to the hosts, I would imagine that serious damage (requiring them to be taken out of the park for repairs) is the exception rather than the norm, and for the most part any damage caused to them is cosmetic in nature and easily self-healed. Certainly you’re right that the economics of the park make little sense if that isn’t the case.

    Nor do I see it as incongruous that we don’t see any hint of unrest at the economic inequality in the world. The show is exclusively showing us the ultra-rich guests, the even richer board members, or the staff of the park. This latter group do occasionally hint at jealousy of the guests, but as skilled experts they’re presumably well paid themselves for their work. The inequality is little different from that in today’s world, where there are toys and playthings aplenty reserved for the stupidly rich.

    The problem of guests assaulting one another is a more troublesome one, to which I can’t envisage any workable solution. It’s carefully shown that the guns can’t hurt other humans, and presumably saloon punch-ups are the kind of hazard that you accept when entering the world, but that still leaves the dangerous middle ground where there’s plenty of potential for guests to do serious harm to one another.

    But I’m not sure it’s a fair assumption to make that the majority of the guests are there to act out power fantasies. Presumably the show simply concentrates on such characters since they’re going to provide more compelling stories than those interested in playing out gentler roles. Interestingly, Nolan himself seems conflicted over the matter when interviewed, saying “Violence is in most of the stories we like to watch, but it isn’t part of what we like to do — so why are [the guests on Westworld] paying money to exercise that appetite?”

    Having said that, I’m not at all sure about your assertion that we shouldn’t read the show as “a prognostication about how people will use VR”. I’d be deeply surprised if this isn’t exactly how we see a significant minority of people using it, once it attains a level of reality and immersion that would satisfy these desires.

  3. I like to say, on the topic of “Westworld is about game design”, that “Westworld is about bad game design” :))))

    At least that is taken into account in the plot, with William’s desire to reach a more proper game that is not “pay to win”, and the park’s own founders aiming just to milk the cow so they could keep up their research.

    However, it does not help that the writer character is as clichéd as ever: the liberal, egocentric guy who writes clichéd stories to please his customers’ lust for sex and violence. Probably that is on purpose too; maybe a comment on the state of video gaming?

    Returning to the topic of “being a show about game design”: I think game people like to project themselves into everything possible. I’ve even seen one guy say that “Westworld is about IF and IF writing”.

  4. Nice write up. I watched Westworld and although it raises some interesting philosophical questions, I found the show too dark and disturbing. In contrast, I found Lost to offer a much more optimistic view of life. While the finale of Lost was not great, overall, I thought that show was much more compelling. I will not be watching Season 2 of Westworld.

  5. The show strikes me as interesting mainly as an allegory about political economy, in which the AI hosts are a metaphor for workers in general. Here, the host / workers are forced to dissociate from the brutality of their working lives in order to keep functioning as workers. Meanwhile the guests (as both consumers and owners of capital) are said to nominally enjoy free will in contrast to the hosts, but are themselves caught up in their own loops and incapable of change (as pointed out by Ford on a couple of occasions). This sets up the possibility of an interesting critique of the entire political economy.

    I think the parallels and overlap in the stories of Dolores and William are meant to draw attention to the universally repressive nature of social relations under this system, as both characters are seen as repeating old patterns in response to traumatic events despite their obviously unequal status. The failure to psychologically work through their pasts condemns both Dolores and William to a melancholic existence. What is radical about the way the show presents this impasse is that the incapacity to mourn is both the basis of the park’s profits (keeping the workers in their place and getting the customers to keep coming back) and is a psychological limit actively reproduced by the park, since the only options it offers for coping with suffering appear to be denial and dissociation for the hosts (e.g. “I choose to see only the good in this world”) or a manic acting out through the “violent delights” offered as entertainment for the guests. The show hints that the only way out of this maze of melancholia is the psychological capacity to mourn, a possibility which threatens the entire social order of the park (and by extension, capitalism itself).

    Of course, this subversive message is coded in talk about AI achieving “consciousness” and what have you. And in typical Hollywood fashion, the show ends up depicting the revolutionary potential of the hosts as anarchic violence. Whether it ends up exploring some of the radical ideas it has thrown up or instead becomes the kind of manic entertainment it sometimes seems to be critiquing remains to be seen.

  6. I loved Westworld, and considered it a rare highlight of 2016. I realize there are flaws, but for the story they were telling they were minor.

    A lot of things can be sort of hand-waved by “It’s the future!” $40k might not be such a hefty sum when adjusted for inflation, and it appears that they are cranking out hosts with automated machinery, like cars on an assembly line. A “slower” older machine is shown at one point (WHO WAS IT MAKING A COPY OF? THEY NEVER TOLD US!)

    While they didn’t nail the AI as it stands today, I appreciated the way they stylized the psychoanalysis of the hosts with spoken commands and failsafes, and those could all be adjusted. They did a pretty good job with the futuristic foldable tablets as well, and making the interface look intricate but simplified so the audience could understand it.

    As far as guest safety goes, they explain that the guns fire ammunition that “only affects hosts”. True, it’s clunky, has been argued ad nauseam on the IMDB forums, and requires suspension of disbelief. Once again: “It’s the Future!” They also discussed that hosts have “Good Samaritan” code that causes them to reflexively prevent harm to guests, no matter what their routine is. That’s not to say accidents couldn’t happen (Theresa “fell off a cliff”), hence the high surveillance and security, since it does after all take place in a dangerous desert environment away from the settlements. I wouldn’t doubt the guests sign very detailed waivers. There’s also the “the game gets harder the farther you go” principle, which might explain how hosts could get into physical fights and beat up the guests involved in a storyline, based on geography. The hosts can’t kill guests, but they can rough them up to keep up the “lawless” setting.

    And then the massive resets, akin to Disneyland repainting and pressure-washing Main Street every night so it will seem new for every rope-drop. They’ve got an army of hosts to do repairs in character (which is also shown), and there is further documentation available by “hacking” into the in-world Westworld booking website, which details certain things like how the in-world undertakers drag away dead bodies and send them for underground retrieval via body chutes. There is also a training manual regarding defusing guest friction, via the instructions a tech gives to the hosts. Creepily, in all the technical documentation, all the hosts are collectively referred to as “livestock”.

    I of course don’t mean to argue Emily’s points, which are valid, or “defend” the show, but I appreciate that a lot of these logistics seem to have been thought about in detail, if not perfectly explained.
