There's Matthew Berman reminding us that the future is coming at us faster than we think. He's talking about video gaming, but the same principles apply to movies, comics, and literature.
The novel – at least, the genre novel – may well go the way of the epic poem, to be replaced by something more like an RPG session which an AI will run for the reader. (Or, more likely, the listener or viewer.) The top authors will devise the elements of the story, the characters and timeline (perhaps more like creative directors than old-style authors) and the AI will use that to tell a story that gives prominence to the bits that interest the individual reader. Did your parents make up stories to tell you when you were little? Like that.
You'll still discuss the story with friends (an important feature of most entertainment) but the specific events in your version may vary from theirs. Initially such on-the-fly stories will be trite, because roleplaying has been infected by a lot of Hollywood pablum about act structure and story tropes, and that's what the AI models will learn from. But eventually the form may shake that off and become a new independent art form. "Not a line, but a bolt of lightning," as C W Longbottom puts it.
In the meantime, a market will remain – small, though, and shrinking – for grown-up fiction that doesn't pander to YA tastes. Genre fiction falls into predictable patterns of plot, and so is easily copied by novice writers and neural nets, whereas literary fiction is harder to fit to a formula because it usually concerns itself with the unique outlook and choices of its characters. But don't assume that because an AI hasn't experienced human emotions it won't eventually be able to write Lolita or War & Peace. Conrad didn't personally have to hack his way through an African jungle to learn how to write Heart of Darkness. It's only a matter of time before those more complex story patterns are learned and replicated by AI, in the same way most authors learn them. And then we'll be in a whole new world of entertainment.
As writers and con artists have always known, you don't need to live a life to make someone think that you have lived it. You just need to know a few characteristics, pieces of knowledge and speech patterns to give people the impression that you do.
It just takes some good observation skills to do that. I'm sure that if an AI could get hold of the emails of all the accountants in the world, for example, it could fool a non-accountant, through text messages, into believing it was an accountant. So yes, I'm sure AI will be able to write literary fiction soon.
I don't know how to feel about it. I guess I'm worried about the slippery slope of everyone having an experience so unique that no one will have any common experiences. I remember hearing stories about how Quatermass brought the country together in the 1950s and, more recently, I remember the 1996 Only Fools and Horses trilogy being watched by about 20 million people out of probably 60 million at the time. Love them or loathe them, we could all watch the same things and we could all converse about them.
Would this fracture groups of people further? Everyone has their own head canon. Would we get groups who think Daenerys didn't go crazy at the end of Game of Thrones and created a just kingdom, versus groups who made Jon the king (but he married Daenerys and she was OK with that), versus groups who just thought the ice zombies should kill everyone?
Society is also full of contrarians and trolls who could now mess with stories for the fun of it and give them the opposite meaning. People have already made up theories that Star Wars is rebel propaganda: the Rebel Alliance are actually terrorists, Obi-Wan is radicalising Luke Skywalker, and Darth Vader is the real hero, trying to moderate and rein in the various personalities in the Empire to keep it running. What happens when two people can draw two completely opposite meanings from the same thing? People can do that already, but now they could create their own versions of Star Wars to back it up, and no one would be able to watch a definitive version to see which opinion holds merit or to find common ground.
I think the point I'm struggling to make is: don't we, as a society, need shared stories? Not everyone will like them, but at least we'd have those things in common. Am I attributing too much impact to this new development?
You make some really interesting points there, Stuart, and I feel this deserves an entire podcast discussion rather than a quick response in comments. I loathe social media for the effect it's had on politics and the boosting of misinformation and crazy conspiracy theories -- but realistically there's no going back. This is the new world and we're going to have to find ways to make it work. A possible answer to the political problem, for example, is not to try to recover the few, large political blocs that used to exist but to push forward with a more integrated model of democratic decision-making using something like Unanimous AI. And maybe AI can help with critical thinking skills so that people can better see when someone is peddling nonsense. I'm not sure if any of that is achievable, only that we can't go back now (unless somebody turns the internet off) so we'd better start trying to think of solutions.
As for shared stories -- yes, I think they serve an important function. When I was at school you could cite Greek or Roman myth/history and other people would know what you meant. Not so much these days. The other day I was quoting examples from Star Trek TNG (a more recent mythology) and realized even that isn't a reliable shared story universe among SF geeks anymore. We are fragmenting into hundreds of tribes watching and reading different stories, and that I do regret.
It looks like AI is the way we are going. I think it was the Swedish prime minister who got caught using ChatGPT, and was condemned for it. Which probably means that a lot more elected officials are using AI and haven't been caught. This might be the first step towards acceptance of AI in running things.
It's fortuitous that you mentioned Star Trek TNG, as this discussion reminded me of Darmok, where the aliens of the week have a language composed entirely of memes. No one can understand them, so the alien captain kidnaps Picard and forces the two of them to have a shared experience so that (a) Picard can work out how their language works, and (b) they can create a new word.
The captain even sacrifices his life to do this, which, to me, shows that shared experiences are important for communication and community building.
My biggest concern about AI isn't the technology itself, it's the ignorance of many of the people using it. Any time a journalist reports on some dumb or hallucinated result they got with AI, I want to scream, "You don't know how to use it!"
That TNG episode came to mind when I mentioned Graeco-Roman myth/history. The Tamarian memes were exactly the equivalent of how we might refer to "Horatius at the bridge", etc, when I was a kid. But even then (the late '60s) it wasn't universal. An American would be far more likely to use Biblical analogies, which meant nothing to me.