AI Dungeon, a text-based fantasy simulation that runs on OpenAI’s GPT-3, has been churning out weird tales since May 2019. Reminiscent of early text adventures like Colossal Cave Adventure, the game lets you choose from a roster of formulaic settings—fantasy, mystery, apocalyptic, cyberpunk, zombies—before picking a character class and name and generating a story.
Here was mine: “You are Mr. Magoo, a survivor trying to survive in a post-apocalyptic world by scavenging among the ruins of what is left. You have a backpack and a canteen. You haven’t eaten in two days, so you’re desperately searching for food.” So began Magoo’s 300-ish-word tale of woe in which, “driven half-mad” by starvation, he happens upon “a man dressed in white.” (Jesus? Gordon Ramsay?) Offering him a greeting kiss, Magoo is stabbed in the neck.
As lame as this story is, it hints at a knotty copyright issue the games industry is only just beginning to unravel. I’ve created a story using my imagination—but to do that I’ve used an AI helper. So who wrote the tale? And who gets paid for the work?
AI Dungeon was created by Nick Walton, a former researcher at a deep learning lab at Brigham Young University in Utah who is now the CEO of Latitude, a company that bills itself as “the future of AI-generated games.” AI Dungeon is certainly not a mainstream title, though it has still attracted millions of players. As Magoo’s tale shows, the player propels the story with action, dialog, and descriptions; AI Dungeon reacts with text, like a dungeon master—or a kind of fantasy improv.
In several years of experimentation with the tool, people have generated far more compelling D&D-esque narratives than mine, as well as videos like “I broke the AI in AI Dungeon with my horrible writing.” It’s also conjured controversy, notably when users began prompting it to make sexually explicit content involving children. And as AI Dungeon—and tools like it—evolve, they will raise more difficult questions about authorship, ownership, and copyright.
Many games give you toolsets to create worlds. Classic series like Halo or Age of Empires include sophisticated mapmakers; Minecraft popularized an open-ended, imaginative form of play that clearly inspired The Legend of Zelda: Tears of the Kingdom’s Fuse and Ultrahand abilities; others, like Dreams or Roblox, are less games than platforms for players to make more games.
Historically, claims of ownership to in-game creations or user-generated creations (IGCs or UGCs) have been rendered moot by “take it or leave it” end-user license agreements—the dreaded EULAs that nobody reads. Generally, this means players surrender any ownership of their creations by switching on the game. (Minecraft is a rare exception here. Its EULA has long afforded players ownership of their IGCs, with relatively few community freakouts.)
AI adds new complexities. Laws in both the US and the UK stipulate that, when it comes to copyright, only humans can claim authorship. So for a game like AI Dungeon, where the platform allows a player to, essentially, “write” a narrative with the help of a chatbot, claims of ownership can get murky: Who owns the output? The company that developed the AI, or the user?
“There’s a big discussion nowadays, with prompt engineering in particular, about the extent to which you as a player imprint your personality and your free and creative choices,” says Alina Trapova, a law professor at University College London who specializes in AI and copyright and has authored several papers on AI Dungeon’s copyright problems. Right now, this gray area is circumvented with a EULA. AI Dungeon’s is particularly vague. It states that users can use content they create “pretty much however they want.” When I emailed Latitude to ask whether I could turn my Mr. Magoo nightmare into a play, book, or film, the support line quickly responded, “Yes, you have complete ownership of any content you created using AI Dungeon.”
Yet games like AI Dungeon (and games people have made with ChatGPT, such as Love in the Classroom) are built on models that have scraped human creativity in order to generate their own content. Fanfic writers have found their work turning up in writing tools like Sudowrite, which uses OpenAI’s GPT-3, the precursor to GPT-4.
Things get even more complicated if someone pays the $9.99 per month required to incorporate Stable Diffusion, the text-to-image generator, which can conjure accompanying pictures for their AI Dungeon stories. Stability AI, the company behind Stable Diffusion, has been hit with lawsuits from visual artists and media company Getty Images.
As generative AI systems grow, the term “plagiarism machines” is beginning to catch on. It’s possible that players of a game using GPT-3 or Stable Diffusion could be making things, in-game, that pull from the work of other people. Latitude’s position appears to be much like Stability AI’s: What the tool produces does not infringe copyright, so the user is the owner of what comes out of it. (Latitude did not respond to questions about these concerns.)
People can’t currently share image-driven stories with AI Dungeon’s story-sharing feature—but the feature offers a window into a future where game developers may use, or allow players to use, third-party AI tools to generate in-game maps or NPC dialog. One outcome not being considered, says Trapova, is that the data behind these tools might be drawn from across the creative industries. This “raises the stakes,” she argues, growing the number of possible infringements and litigious parties. (Stability AI and OpenAI did not respond to queries about this point.)
Some platforms have adopted a more cautious approach. In March, Roblox rolled out two new tools in Roblox Studio, the program players use to build games. One, a code completion tool called Code Assist, automatically suggests lines of code. The other, Material Generator, allows players to create graphics from prompts like “bright red rock canyon” and “brand new wood flooring.”
Both of these tools use generative AI, but they have been trained on assets that have been released for reuse by Roblox’s community, not on games created by the community. “Every creator on the platform can leverage these tools without sharing their data,” says Stefano Corazza, head of Roblox Studio. AI Dungeon, by comparison, pulls images and ideas from god-knows-where.
That caution with regard to training data is important, because player permission will be the critical issue going forward. Corazza admits that some in the Roblox community bridle at the idea that their work will train AI. They see their code as their “secret sauce,” he says, and assume that rivals will be able to harvest it to recreate their game. (As Corazza points out, that isn’t how these tools work, but the worry is understandable.)
He suggests that Roblox is looking at an opt-in “system” for allowing user data to train AI, though the company hasn’t made any final decisions. “Roblox Studio has made it clear that we will provide a mechanism so that creators can manage the use of their data for generative AI training,” says Corazza. “If and as our approach evolves, we will be transparent with creators.”
The situation could quickly change if Roblox and companies like it decide they need more of your data. Roblox’s EULA (under the section titled “rights and ownership of UGC”) makes clear that its community doesn’t have the same rights as someone who just builds their own game from scratch. Were the company to change its mind, there is very little, legally, the community could do; Corazza counters that if Roblox acts tyrannically, the community will protest. “I think the legal aspect is less important. It’s more important to respect the community,” he says.
Integration with third-party tools brings the same potential problems faced by AI Dungeon. Roblox and Stanford University have already collaborated to create ControlNet, a tool that gives artists deeper control over large diffusion models like Stable Diffusion. (Redditors used the tool to produce a series of impressive QR code anime figures.) “Although we cannot verify the provenance of every asset that our creators upload to the platform, we have a unique and very robust moderation system to make sure the content is compliant,” says Corazza.
Trapova suggests the game development industry is on the brink of a generative AI reckoning. “They look super cool,” she says of game development tools like AI Dungeon. “But this just gives you a flavor of the issues that we will end up having if this all goes on steroids.” Soon, such legal problems will become impossible to ignore.