Tuesday I was invited to speak at the interactive narratives summit at the London Games Festival, specifically in a debate over whether AI can create a good story.
Perhaps the original scheme was to stage a good showdown, but I have somewhat complicated views about what the question even means, and, as it happens, so did my would-be opponent Brenden Gibbons. So instead we had a more temperate but, I think, more interesting conversation, moderated by David Tomchak.
This is not a transcript of that conversation (I don't have one), but an attempt to recapture some key points, drawing also on notes I made before the event and expanding some of the ideas with links or examples I didn't have available in the room.
First: AI can definitely already create stories, by pretty much any definition that a narratologist would establish. Indeed, we can set the bar higher than just “is there a sequence of causally-linked events,” though many scholars would accept that as enough. Some of GPT-2’s output is interesting, funny, and narrative. So are the outputs of other techniques stretching back to the 70s, from generative grammars to the model-and-curate approach used by James Ryan in his recent dissertation Curating Simulated Storyworlds. If AI were an orchard, we would have already plucked many and diverse story fruits there.
Second: fewer of those techniques reliably create coherent stories every time they’re run. There are some strategies to deal with this — hence James Ryan’s emphasis on curation above — but it means that of the ways to make a story with AI, many are unsuited to placement in the runtime of a game, where you want to make sure the player will always have an acceptable experience. If AI story generators were cars, a lot of them would be lemons, constantly breaking down at the side of the road.
Still, there are some techniques, using grammars or certain other constraints, that have a high likelihood, tantamount to a guarantee, of making something one would acknowledge to be a story. So, next question:
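To make the grammar idea concrete, here is a toy sketch (my own illustration, not any system discussed at the event): because every rule in a generative grammar terminates in plain text, every run is all but guaranteed to produce a complete, story-shaped sequence of setup, complication, and resolution.

```python
import random

# A toy generative grammar for miniature stories. Every rule bottoms
# out in plain text, so every expansion yields a complete story shape.
GRAMMAR = {
    "story": ["<setup> <complication> <resolution>"],
    "setup": ["<hero> lived quietly in <place>."],
    "complication": ["One day <villain> stole the <object>.",
                     "One night the <object> vanished from <place>."],
    "resolution": ["<hero> recovered the <object> and returned home.",
                   "<hero> gave up the <object> and found peace instead."],
    "hero": ["a tailor", "an old astronomer", "the miller's daughter"],
    "villain": ["a jealous neighbour", "a passing magician"],
    "object": ["moon-clock", "silver loom"],
    "place": ["a village by the sea", "a tower above the fog"],
}

def expand(symbol: str, rng: random.Random) -> str:
    """Recursively expand <symbol> tokens until only plain text remains."""
    production = rng.choice(GRAMMAR[symbol])
    while "<" in production:
        start = production.index("<")
        end = production.index(">", start)
        inner = production[start + 1:end]
        production = production[:start] + expand(inner, rng) + production[end + 1:]
    return production

print(expand("story", random.Random(1)))
```

The surprise here comes only from which branch is chosen at each rule, which is exactly why grammar output tends to be coherent but less startling than a neural model's.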
Three: what do we mean by good?
Does it need to be novel? How novel? The output of a grammar can surprise the author, but typically less so than the output of a neural net model. On the other hand, the grammar’s output may be more surprising to a reader, especially if the NN model is trained on a fairly uniform corpus of sample text: in that case the resulting text may read as little more than parody, a trope-heavy recasting of a genre. There are some additional possibilities, such as conceptual blending approaches and style transfer, that allow AI to map different sources together in surprising new ways.
Does it need to exhibit the properties we might describe as creative? (If you’re interested in how we even define creativity in the context of computationally generated art, I recommend the past proceedings of the International Conference on Computational Creativity.)
Does it need to be true, in the way fiction can be true? Are we asking for a story that teaches us something new about the world, reveals a viewpoint or experience we haven’t had ourselves, offers a critique on our existing systems, or otherwise does the work we tend to associate with art or literature?
This is where I think we tend to expect the least from AI; and I would agree that if what we mean by story is “an expression of an author’s subjective experience,” then until we have AGI, either an AI cannot write stories, or an AI can only be a tool to realize the stories of a human creator.
Still, AI creations do sometimes reveal things to us about ourselves from a perspective that is not human, precisely by showing the strange patterns in our behaviour that we might not recognize ourselves. It is perhaps worth considering that at its best, AI will tell us stories that only AI could tell, and they’ll have different qualities from the stories told by people. (This is something I talked about in my Gamelab talk in Barcelona a few years ago, and I stand by it now.)
If AI were a great storyteller, it would be an alien bard who has studied our planet from orbit and still doesn’t get it, but whose tales embarrass us with their insights all the same.
Fourth: why would we want AI to tell stories?
For me, this is not chiefly about replacing human authorship; there are lots of authors, and far more people who want to write game stories than there are people who actually have that job. If the only point of AI were to take away work humans dislike, there would be many, many possible AI deployments ahead of storytelling. Most people I know enjoy telling stories, at least in recreational and social contexts, and have a natural ability for it; that includes small children.
However, the interactive realization of stories, the creation of dynamic experiences that adapt around the player’s intervention: this is where I come in, because I started to need procedural techniques when I wanted to build experiences with more narrative agency and a more adaptive world than I could possibly account for or write myself. I want AI to help me tell my stories, or the stories that the player and I create together.