How generative AI will create games with ‘broader, bigger, and deeper worlds’

Of all the use cases for generative AI out there, I can’t think of one more significant than video games. Sure, we’ve seen people create simple games with GPT-4, but I assumed a technology this powerful was being discussed at the higher levels of game development, too.

To get an idea of how big of a shift this could be, I wanted to talk to someone who actually understands how games are made on a technical level. Marc Whitten, Senior Vice President and General Manager of Unity Create, is surely one such person. He’s particularly excited about how AI could transform game development, and we spoke about how the tools that could enable that revolution are already making their way to creators.

Faster create time

Ziva Face Trainer in Unity.

Games take a huge amount of time and effort to develop, but most of that time is dedicated to creating all of the content for the game. Whitten says that if you look at a common 300-person AAA studio, somewhere around 80% of them are dedicated to creating content. AI can speed up that process drastically.

Whitten provided a clear example of that: Ziva Face Trainer. Ziva is a company Unity acquired in early 2022, and it has been working on its Face Trainer tool for a little over two years. It takes a model, trains it on a large set of emotions and movements, and generates something usable.

How much time does this save? Whitten says high-end rigging of a character can take a team of four to six artists four to six months: “Candidly, [that’s] why the state-of-the-art quality of characters has not actually progressed that much in the last ten years or so.”

Senua’s Saga: Hellblade II – The Game Awards 2019 – Announce Trailer (In-Engine)

With Ziva Face Trainer, developers “give it a mesh and we train that mesh against a large set of data… so you get back in five minutes a rig model that allows you to then run it in real-time.” Ziva tech is being used a lot, too. It’s behind the suit deformation in Spider-Man: Miles Morales, as well as the Troll in the Senua’s Saga: Hellblade II trailer. You’ve probably even seen it in a few movies and TV shows — Captain Marvel, John Wick 3, and Game of Thrones are on the list.

That shouldn’t come as a surprise. Machine learning and procedural techniques (SpeedTree, for example) aren’t exactly new in the world of game development. It’s true that more research into AI models can lead to even more efficient creation pipelines, but we’re seeing a shift with generative AI. We’re talking about large language models (LLMs) like GPT-4 and diffusion models like Midjourney, and they can radically change the games we see.

Changing the game

Whitten says the hope with AI is to make games “ten to the third better,” meaning games that are ten times faster, ten times easier, and ten times cheaper to develop. The result isn’t a flood of the same games we already have, though. Whitten believes the result is “broader, bigger, deeper worlds.”

I asked for an example, and Whitten pondered what Skyrim would look like if it had a generative AI model behind it. We’ve all heard the “arrow to the knee” meme from the game, but Whitten imagined a game where that throwaway line meant something more.

“Well, what if each of those guards actually had a Myers-Briggs-type chart? A little bit of a backstory and frankly, a backstory that could have been impacted by that. What has happened with the character along the way? And then an AI model to generate what would be a rational response coming out of that, given all of those particular events.”
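The idea Whitten describes can be sketched in code: each NPC carries a personality profile and a running event history, and everything gets assembled into a prompt for a language model at runtime. The following Python sketch is purely illustrative (the `Guard` class, its fields, and the prompt format are all hypothetical; the actual model call is left out):

```python
from dataclasses import dataclass, field

@dataclass
class Guard:
    """A hypothetical NPC record: a personality profile plus an event history."""
    name: str
    personality: str  # e.g. a Myers-Briggs-style type, as Whitten suggests
    backstory: str
    events: list = field(default_factory=list)

    def remember(self, event: str) -> None:
        """Record something that happened to this NPC during play."""
        self.events.append(event)

    def build_prompt(self, player_line: str) -> str:
        """Assemble everything a language model needs for an in-character reply."""
        history = "; ".join(self.events) or "nothing notable"
        return (
            f"You are {self.name}, a town guard. Personality: {self.personality}. "
            f"Backstory: {self.backstory}. Recent events: {history}. "
            f"The player says: '{player_line}'. Reply in character."
        )

guard = Guard(
    name="Hadvar",
    personality="ISTJ",
    backstory="Used to be an adventurer, until an arrow to the knee",
)
guard.remember("the player returned his stolen sword")
prompt = guard.build_prompt("Any trouble in town?")
# The assembled prompt would then be sent to an LLM to generate the guard's line.
```

The point of the structure is that the same throwaway NPC produces different, rational responses depending on what has actually happened to it in this playthrough.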

We’re seeing some effort there with games like The Portopia Serial Murder Case, which, bluntly, haven’t made the best case for AI in games. It’s not hard to see the potential, though, especially in larger games with NPCs that don’t have set quests or exhaustive dialogue.

A player converses with an NPC in The Portopia Serial Murder Case.

There’s a lot of potential in sandbox-style games, as well. Whitten imagined a GTA-style game where you “go into the pawn shop and recruit the person behind the desk and, you know, with maybe the game creator never even thinking about that as a possibility because of something else that happened in the game.” Whitten also thought about Scribblenauts, except in a world where you could truly make anything and assign it any properties.

The problem right now is getting that to actually work, as evidenced by The Portopia Serial Murder Case. Whitten was one of the founding members of the Xbox team at Microsoft, and he helped lead the Kinect push. About Kinect, Whitten said: “I would tell everybody it works amazing if I’m sitting next to you.” You needed to prompt it in a specific way, and if you deviated, it wouldn’t work.

That’s the big problem that’s faced AI as a whole, with smart assistants like Alexa only operating within a narrow range. LLMs change that dynamic and allow for any prompt, and that’s what’s exciting about creating deeper game worlds. There’s still a road to get there, though.

“If you put the tool out there … [creators will] hit whatever the boundaries are and say, ‘Well, that’s not fun.’ But then they’re going to actually go find the space that no one’s even thinking about,” Whitten said.

With more tools rolling out, we could see some early experiments with AI within the next year. We already have in some cases, such as the wildly popular AI Dungeon 2. But to make this sort of immersive world possible at scale, you need a middleman. And for Unity, that middleman is Barracuda.

The Barracuda

An image from Unity's Book of the Dead.

Unity includes a neural network inference library called Barracuda. As Whitten explains, “It’s an inference engine that allows you to drive either diffusion or other forms of generative content at runtime on the device without hitting the cloud and at a highly performant pace.”

Oh yeah, performance. As much as we like to talk about how AI can change content forever, there’s a massive computational cost (there’s a reason it took tens of thousands of GPUs to build ChatGPT). Barracuda allows those models to run on your CPU or GPU so you don’t have to go out to the cloud, which, for the record, would be a huge money-sink for developers.
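To make the on-device idea concrete, here is a toy sketch of the pattern a runtime inference library enables: the trained weights live locally, and each frame the game runs a forward pass on the local CPU with no network round-trip. This uses a tiny random two-layer network as a stand-in for a real trained model; it is not Barracuda’s API, just an illustration of the concept:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy stand-in for a trained model: two dense layers with a ReLU between.
# In a real engine, an inference library (like Unity's Barracuda) would
# execute a genuinely trained network on the local CPU or GPU instead.
W1, b1 = rng.standard_normal((16, 32)), np.zeros(32)
W2, b2 = rng.standard_normal((32, 8)), np.zeros(8)

def infer(features: np.ndarray) -> np.ndarray:
    """One forward pass, entirely on-device: no cloud call, no per-request bill."""
    hidden = np.maximum(features @ W1 + b1, 0.0)  # ReLU activation
    return hidden @ W2 + b2

# Simulate querying the model once per frame for one second at 60 fps.
for _ in range(60):
    out = infer(rng.standard_normal(16))
```

The design choice is the whole story here: keeping inference local trades cloud latency and recurring cost for a fixed compute budget the game has to fit within each frame.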

Unity is working on more features for Barracuda, and Whitten says the “interest back from the game creator community has been extraordinarily high.” It’s the key that makes generative AI possible in game development and design, especially without requiring any specific hardware.

Whitten says the team wants to start “building techniques that allow creators to start really targeting a large and core part of their game design, not ‘Oh, this is going to really diminish my audience if I design for it.'” Unreal Engine, for its part, has a similar tool (the aptly-named NeuralNetworkInference tool, or NNI).

These libraries, when met with large generative AI models and an acceleration in content development, can lead to an “explosion of creativity,” according to Whitten. And that’s something to get excited about for the future of games.

This article is part of ReSpec – an ongoing biweekly column that includes discussions, advice, and in-depth reporting on the tech behind PC gaming.

Jacob Roach
Lead Reporter, PC Hardware
Jacob Roach is the lead reporter for PC hardware at Digital Trends. In addition to covering the latest PC components, from…