Can writing an article about next-gen AI storytelling be an adventure?
We decided to find out by asking the video game AI Dungeon, fueled by today’s most advanced content creation platform, to co-write an article about … itself. As you can see from the resulting collaboration below, which includes our interview with the game’s creator, the game has quite a lot to say.
While the game offers players several traditional genre options—fantasy, sci-fi, cyberpunk, etc.—when starting a new campaign, we chose the “custom” experience and entered the text below to help the game’s AI set the stage:
You are an Adweek reporter writing about a text adventure role-playing game called AI Dungeon, which is powered by cutting-edge language generation machine learning.
The software engine behind the game was recently upgraded to research group OpenAI’s latest copy-generating model, GPT-3, built with upwards of 100 times more parameters than the previous version, GPT-2. Released last month, the system was fed around a trillion words and cost $12 million to develop.
(Italicized text below was written by the game’s AI, which then responded to each subsequent paragraph in the article.)
The project has been controversial from day one. Some people believe that it will lead to a totalitarian future where artificial intelligence can be used for evil purposes like controlling the world or even taking over the planet.
After several tries, you finally manage to get the AI to write an at least vaguely true statement to justify this gimmicky article format. GPT-2 was in fact controversial upon its release last year. OpenAI opted to delay the full version for fear that its ability to generate realistic-sounding copy would be used for large-scale production of fake news and spam. AI Dungeon is one of the few commercial applications to emerge from GPT-2, and it recently upgraded to GPT-3.
“GPT-3 is capable of generating any sentence you want,” says the AI. “But I’m not sure if it’s going to be able to produce something as good as a real story.”
The AI has arrived at a key question about this new tech: Is it capable of supplementing human creativity, and does it point toward a future of automated interactive storytelling?
You write a headline.
You try to think of a good, clickbait-y headline that will make people want to read this story.
You finally think of one:
“DARK FUTURE: AI’s Use in Story Creation Raises Eyebrows”
You interview AI Dungeon co-creator Nick Walton.
You call Nick to try to get a better understanding of the AI’s capabilities and how it’s being used in the project. “How can I help you?” asks Nick.
You say, “What’s your overall impression of GPT-3 so far? How does it compare to GPT-2?”
“It’s just much more coherent in terms of understanding who the characters are, what they’re saying, what’s going on in the story and just being able to write an interesting and believable story,” Walton said. “So I think we definitely see an increase in the ability of the AI to think of realistic interactions with other characters.”
The difference between GPT-2 and GPT-3 is huge, but it’s not really that surprising.
It’s not exactly fair to use the game to write this article, because its strong suit is fantasy, though it offers other genres including mystery and apocalyptic. (This story was generated with the custom premise option.) The game’s model was trained on text adventure stories scraped from chooseyourstory.com.
Most of the more typical quests generated by AI Dungeon involve missions like tracking down mystical spell books, defeating dragons and solving murders.
But because the system on which it’s built has been trained on millions of webpages, it can also understand references outside of that realm. GPT-3 is so powerful that it hardly needs any extra training—OpenAI claims that it can imitate a style of writing after seeing only a few example passages.
But this type of research is bound to cause controversy. It could potentially be used to create fake news, not to mention that hiring freelance writers could become a thing of the past.
GPT-3 is far from replacing human writers, though. In fact, it’s seen little commercial adoption outside of this game and some experiments with chatbots and web copywriting. Part of that might have to do with its inherent unpredictability posing brand safety risks. OpenAI, a hybrid nonprofit/for-profit organization, has been attempting to change this with its first commercial product: an API based largely on GPT-3.
“The No. 1 issue is just capability,” Walton said. “We’re a lot more fault-tolerant than potential business applications. Because if AI says something wrong or crazy, it’s fine because it’s just a game. Whereas if you have some business application dependent on it, you could potentially be in a more difficult situation.”
And even if you’re not worried about whether it’ll say something problematic, you might not be able to understand what the hell it’s saying.
“You really need to just experiment and try out a lot of things—it’s kind of like playing an instrument, but it’s a new instrument that was just invented,” Walton said. “So there’s no experts, there’s no master classes on how to play this instrument. The people that will get the best at it are people who practice and try a lot of interesting new things.”