Virtual production set powered by Unreal Engine.

7 Ways Video Games Are Changing How We Make Movies

In its day, the 1982 film Tron was the state of the art in digital storytelling.

The film introduced audiences to a stunning new kind of cinematic visuals, unlike basically anything else in theaters or on TV. And perhaps more importantly for modern filmmakers, Tron’s creators pioneered CG animation technology that paved the way for many cutting-edge VFX tools.

Yet this spectacular visual style, and its subsequent impact on the industry, was inspired by one of the earliest (and simplest) commercial video games, Pong.

To audiences and filmmakers alike, Tron looked like a video game, but in a totally new context. It was on a (much) bigger screen, the graphics were breathtaking, and it told a sophisticated story.

In that era, video game creators could only envy the superior visuals and storytelling tools of their cinematic counterparts. So the use of video game-style graphics/technology for a feature film was novel and exciting.

Flash forward nearly 40 years, and the tables have turned.

Video games have now come into their own as a storytelling medium. Modern games boast astoundingly realistic graphics, complex open worlds, and emotionally nuanced narratives that rival the best of Hollywood.

While Tron once imagined new ways of visualizing a computer game, it’s today’s games that are pushing the technical and creative innovations of how we make movies.

So let’s look at 7 tools and trends from the video game industry that are having a profound impact on how filmmakers work and stay creative.

Ready, player?

First, let’s put things in context.

As of 2020, video games are a $180 billion business. That dwarfs the $100 billion global film industry.

Game developers produce a huge amount of content and fund an incredible amount of R&D, and the titles they produce define many emerging entertainment trends. A big chunk of that $180 billion goes to developing innovative tools to build even more games. But these tools are also benefiting other industries, like film and television.

Perhaps the most important point here is that the game industry can massively outspend the film industry when it comes to developing new storytelling tools. And that opens up a huge opportunity for filmmakers to grow and adapt.

Take Epic Games’ Fortnite as an example. This game renders expansive, action-packed, 3D environments in real time using Epic’s powerful Unreal Engine.

But Fortnite is more than just a video game. It’s a soundstage.

In replay mode, Fortnite lets gamers precisely control virtual cameras that can dart in and out of the action or float above the fray, capturing every victory or defeat with surprising clarity.

Here’s another thing about Fortnite. It’s insanely profitable. Despite being completely free-to-play, it earned $9 billion in revenue in its first two years through in-game purchases. And that’s good for filmmakers.

Why? Because Fortnite is such a cash cow, Epic can afford to put massive resources behind the development of Unreal.

This is great news for the film industry, because Unreal has evolved into a hugely powerful tool for modern video workflows.

1. The new soundstage

In recent years, many high-end VFX visualization pioneers have moved away from traditional 3D applications, like Autodesk Maya and 3ds Max, and started using Unreal Engine.

The Mandalorian went a step further by using Unreal for “in-camera VFX.” That means ILM used Unreal to render background environments in real time for display on giant LED walls around the soundstage. That allows scenes to have incredible CG environments with almost no post-production required.

Epic is now actively encouraging Hollywood’s use of Unreal by adding new features specifically designed for filmmakers.

For example, the Virtual Camera plug-in lets you use an iPad Pro as the viewfinder for an imaginary camera. Position and orientation are tracked as the user moves around the space and then imported into the 3D scene.

“Epic is taking the success of Fortnite and putting that money back into the engine,” says cinematographer and technologist Jim Geduldick, SVP of Virtual Production at Dimension North America, a branch of London-based Dimension Studio. “Filmmakers are looking at these toolsets and thinking, ‘I used to need a team to do that, but now I can do my shot blocking in the game engine.’”

Of course, Unreal isn’t the only game in town. The Unity engine offers many of the same capabilities as Unreal, but with a focus on ease of use. Just like Unreal, Unity is in use for a wide variety of productions.

Other game engine-like environments, such as Notch and Derivative’s TouchDesigner, are also helping bridge the gap between real and digital worlds. These tools enable real time set mapping for immersive live production, like Amazon’s nine-part behind-the-scenes miniseries Inside The Boys.

The team tracked multiple live cameras against virtual 3D backdrops and added augmented reality elements in real time. That delivered final shots in-camera, without the time and resource costs of traditional VFX post-production tools.

And this technology is spreading like wildfire through scripted programs too, including Station 19, 1899, the new season of Star Trek: Discovery, and the upcoming Taika Waititi-helmed comedy Our Flag Means Death.

All this to say, game engines are quickly becoming one of the most important filmmaking tools for creative innovation and improving workflow efficiency.

2. Pushing pixels

But why are filmmakers only realizing the benefits of game engines now?

Unreal Engine debuted in 1998, and filmmakers have been experimenting with some variety of real time virtual production since at least 2006. That’s when Weta Digital developed the virtual camera tech used in James Cameron’s Avatar.

Why is it only in the last couple of years that virtual production techniques have really taken hold in Hollywood?

The answer is the GPU (graphics processing unit). 

It’s one thing for Weta to push around enough polygons for Cameron to do scene blocking in a simple virtual environment. It’s another thing to do what ILM did on The Mandalorian in real time, which requires far more capable software and hardware.

Fortunately for filmmakers, most of whom don’t have a legendary VFX company in their basement, the demand for video games has commoditized graphics processing.

On the high end, filmmakers can spin up monstrously powerful GPU servers in the cloud as they need it, all without needing to invest in hard infrastructure. 

More affordably, for users who need to edit 4K footage (and higher) or manage a DIY VFX workflow, there are a huge variety of powerful GPUs that cost just a fraction of the cameras we shoot with.

To explore the power of today’s graphics cards, let’s look at the workflow behind Pixar’s first feature film, Toy Story.

Pixar used a cluster of 117 Sun workstations (87 of them dual-processor, 30 of them quad-processor) to render the film. But with mid-1990s technology, it took over 800,000 machine hours to render the final cut of Toy Story.
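
For a rough sense of just how long that is, here’s a quick back-of-the-envelope calculation. (It’s only a sketch: it assumes the render load was spread evenly and the machines ran around the clock, which certainly wasn’t the case in practice.)

```python
# Back-of-the-envelope math on the Toy Story render farm.
# Assumes the 800,000 machine-hours were split evenly across all 117
# workstations running 24/7 -- a simplification, not studio history.
machine_hours = 800_000
workstations = 117

hours_per_machine = machine_hours / workstations  # ~6,840 hours each
days_per_machine = hours_per_machine / 24         # ~285 days of nonstop rendering

print(f"{hours_per_machine:,.0f} hours per machine, "
      f"or about {days_per_machine:.0f} days of continuous rendering")
```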

Mark VandeWettering, who was a Pixar software engineer at the time of the film, says that today’s GPUs would make short work of Toy Story. “It’s pretty clear that modern video games are well beyond what we could achieve back in 1995,” he says.

Of course, our standards for computer graphics and animation have progressed a lot since Toy Story. But the point stands that today’s GPUs wield more compute power than the entire workflows behind some of Hollywood’s greatest 3D animated films.

If those teams had access to modern hardware to let them render and iterate faster, imagine how many more animated masterpieces could have been made with the same resources and talent. Or imagine all of the new creative techniques they could have experimented with.

Fortunately, we don’t have to imagine. We now live in that age, where filmmakers command nigh-unimaginable computational resources. The levels of computer performance that were previously a high mark for supercomputers are now found in basically every video game console. And for an absurdly low price.

GPUs have become fundamental to many industries, but especially filmmaking.

From previs and virtual production, all the way through to editing, compositing, and visual effects, GPU performance enables previously unattainable levels of creativity and efficiency.

Graphics hardware has become so powerful, in fact, that filmmakers would be crazy not to take advantage of this new tool—even if it means rewriting the traditional rules of production.

3. Merging worlds

And those rules of production are definitely being rewritten thanks to two closely-related staples of video game development: volumetric capture and photogrammetry.

Essentially, these techniques produce more than just the flat, 2D image data of a typical still or video camera. They use depth-sensing cameras or camera arrays to extrapolate 3D models of an object or scene. Those models can then be used in a virtual environment or post-production.
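
To make that a little more concrete, here’s a minimal sketch of the very first step of a photogrammetry pipeline: recovering a relative camera pose and a sparse 3D point cloud from just two overlapping photos, using the open-source OpenCV library in Python. The file names and camera intrinsics are placeholders, and real tools use hundreds of images plus far more sophisticated solvers.

```python
# Minimal two-view photogrammetry sketch with OpenCV (illustrative only).
# Real pipelines use hundreds of images, bundle adjustment, and dense
# reconstruction; this just recovers a sparse point cloud from two photos.
import cv2
import numpy as np

# Placeholder inputs: two overlapping photos and assumed camera intrinsics.
img1 = cv2.imread("view_a.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view_b.jpg", cv2.IMREAD_GRAYSCALE)
K = np.array([[1200.0, 0, 960], [0, 1200.0, 540], [0, 0, 1]])

# 1. Detect and match distinctive features between the two views.
sift = cv2.SIFT_create()  # requires a recent OpenCV build
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)
matches = cv2.BFMatcher().knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]  # ratio test

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# 2. Estimate how the second camera moved relative to the first.
E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)

# 3. Triangulate the matched features into a sparse 3D point cloud.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
points4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
points3d = (points4d[:3] / points4d[3]).T

print(f"Recovered {len(points3d)} 3D points from {len(good)} feature matches")
```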

If this sounds like science fiction, you may be surprised to learn that photogrammetric tools are already transforming real-world workflows.

For many productions, simply shooting beautiful footage is no longer enough. Modern VFX teams need more (and better) information to mesh their digital creations into the real-world scene that was captured by the camera.

Because much of that work begins long before cameras start rolling, it’s no longer a linear flow from pre-production to production to post-production. And photogrammetry and volumetric capture are helping meet these workflow needs.

Outside of feature films, sports broadcasters have been among the earliest adopters of these techniques. Over the past several years, they’ve been implementing the required infrastructure into sports venues, which allows them to capture the entire playing field as volumetric and photogrammetric data.

The live footage travels via fiber connection to servers that crunch the images into volumetric data that production teams can use to generate 360-degree instant replays, virtual camera views, or even first-person player perspectives.

It’s important to note that, unlike game engines, volumetric capture doesn’t usually yield a 3D image in real time. Intel acknowledges that it takes about 30 seconds to create a single volumetric frame, even with beefy servers located on-site.
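
A quick bit of math shows why that 30-second figure keeps volumetric capture out of truly live workflows (assuming a standard 24 fps deliverable):

```python
# Why 30 seconds per volumetric frame isn't "real time."
seconds_per_volumetric_frame = 30
frames_per_second_of_footage = 24  # assuming a 24 fps deliverable

per_second = seconds_per_volumetric_frame * frames_per_second_of_footage  # 720 s
print(f"{per_second / 60:.0f} minutes of processing per second of footage")
print(f"{per_second * 60 / 3600:.0f} hours of processing per minute of footage")
```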

And while volumetric capture isn’t necessarily photorealistic, the unique, sometimes-glitchy appearance of volumetric renders has intrigued some filmmakers.

In 2016, NYC production studio Scatter collaborated with documentary filmmaker Alex Gibney on Zero Days VR. It used volumetric video to portray an NSA whistleblower as a disembodied head glitching between a ghostly holographic look and photorealism.

For filmmakers who want to experiment with small-scale volumetric capture, Scatter sells Depthkit software that works with devices like Microsoft’s Azure Kinect and Intel’s line of RealSense depth cameras.
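
If you’re curious what the raw material looks like, here’s a minimal sketch of pulling a single depth frame from a RealSense camera with Intel’s pyrealsense2 Python bindings. The resolution and frame rate are just typical values, and Depthkit itself handles all of this (plus calibration, texturing, and export) for you.

```python
# Minimal depth-capture sketch for an Intel RealSense camera (illustrative).
# Tools like Depthkit wrap this up with calibration, texturing, and export.
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)  # assumed mode
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()

    # Each pixel is a distance from the camera -- the raw ingredient of a
    # volumetric capture -- rather than a color value.
    depth_image = np.asanyarray(depth.get_data())
    center_distance = depth.get_distance(320, 240)
    print(f"Depth frame shape: {depth_image.shape}, center pixel ~ {center_distance:.2f} m")
finally:
    pipeline.stop()
```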

Writer-director Neill Blomkamp (District 9, Elysium) has embraced volumetric capture on a larger scale. His new film Demonic, set for release on August 20, includes more than 15 minutes of volumetric footage.

For scenes where the film’s main character explores a simulation of the brain of her comatose mother, actors performed inside a 260-camera volumetric rig. Those volumetric captures were composited into 3D environments using the Unity engine and a new, patent-pending technology code-named Project Inplay, which is designed for real time playback, rendering, and even dynamic relighting of large volumetric point clouds.

The film’s trailer includes glimpses of the simulation, which seems more unsettling than photorealistic—appropriate for a horror film.

By the way, Blomkamp borrowed one more crucial element from video games: he commissioned a score from Ola Strandh, a composer best known for his electronics-heavy music for Tom Clancy’s The Division video game franchise.

High-profile projects like these are increasing demand for volumetric and photogrammetric tools, and that’s attracting the attention of major tech companies. The result is more investment, as each tries to capture some of the growing interest in volumetric capture and virtual production.

Dimension Studio, for instance, has partnered with Avatar Studios and Sabey Data Centers to open Avatar Dimension, a new volumetric capture studio in Ashburn, VA. There it plans to specialize in “virtual experiences for enterprise.” That’s just one of five Microsoft-certified Mixed Reality Capture Studios that are currently being built.

But volumetric capture and photogrammetry are not limited to exotic, enterprise-level workflows. Productions of every scale can already start taking advantage of these new tools.

For example, Apple debuted Object Capture at WWDC 2021. This new API will allow developers to turn basically any Mac or iPhone into a tool for photogrammetric capture.

As the power and affordability of GPUs continues to grow, we can only expect these sorts of developments will continue. And that will have a profound impact on how we tackle production and post-production.

4. A new way to play

One way this impact is being felt is where and how pre-production teams work.

Many assume that virtual production is still limited to LED stages and complex volumetric capture rigs. But as a tool for efficient pre-production, previs, and techvis, it can work at almost any budget.

For example, take a look at CineTracer, a real time cinematography simulator. This $90 app uses Unreal Engine to let you work out scene lighting, shot blocking, and storyboards all inside what is essentially a video game. And you literally buy it on the video game marketplace Steam.

According to CineTracer’s creator, Matt Workman, these sorts of tools will be the bridge for many into virtual production. Workflows will evolve to include these kinds of software tools, and then as more affordable LED stages open up in major shooting markets, we’ll start to see turnkey services being offered to filmmakers. That will provide an easy transition into larger-scale virtual production tools for many teams.

“You’re not going to approach a big-budget TV series that way,” he says. “But if it’s just a case where, for example, you need to be on the moon, you’ll be able to source affordable CG backgrounds and show up at one of these stages and shoot it. It’s not like you set up your own soundstage to go make a commercial. You rent it. Virtual production will be similar.”

“If you’re on a small team and you want to make a film using these technologies, you can do that without ever setting foot on an LED volume. Use it to do all of your VFX, blocking, storyboarding and previs, then go off and shoot traditionally.”

You don’t even have to sit at your workstation to use Ghostwheel’s Previs Pro, an iOS app that creates storyboards from virtual cameras, lights, characters and props in 3D environments. It even has an Augmented Reality mode to help you visualize your scene in a real space.

Obviously, these powerful and affordable tools open up a lot of completely new opportunities for video professionals. But some innovations that have come out of the world of games have been a bit more invisible.

5. Taking control

Think about the first video game you ever played. How did you play it? What was the controller like?

For many of us, our first games were endured on the awkward controllers of legacy consoles, like the hard, sharp corners of the Nintendo Entertainment System’s rectangular controller. Or on the tacky keyboards and slow ball mice of the 1990s.

But since then, video game developers and manufacturers have spent decades and untold millions to nearly perfect the design of game controllers.

Why? Because the human-machine interface is a fundamental consideration of video game design. Without an adequate way to control what’s happening on screen, games are not fun, engaging, or even playable, which of course makes them unprofitable.

That’s why today we have incredibly sensitive dual-stick control schemes, haptic feedback, adaptive trigger technology, and a lot more. All in the name of giving gamers hair-trigger control of the on-screen action.

But these controllers are useful for a lot more than games. Similar input devices are routinely used by camera operators to intuitively control cameras, gimbals, drones, and other production hardware.

Some enterprising video editors have even reprogrammed control pads aimed at professional gamers to work with post-production software, by mapping certain keyboard shortcuts to buttons, triggers, and dials.
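
That kind of remapping doesn’t require anything exotic. Here’s a rough sketch of the idea in Python, reading a gamepad with pygame and firing editing shortcuts with pyautogui. The button numbers and J/K/L-style shortcuts are hypothetical placeholders; in practice most editors just use the controller vendor’s own remapping software.

```python
# Rough sketch: turn game-controller buttons into editing shortcuts.
# Button numbers and shortcuts are hypothetical placeholders; map them
# to whatever your own controller and NLE actually use.
import pygame
import pyautogui

BUTTON_TO_KEYS = {
    0: ["j"],          # shuttle backward
    1: ["k"],          # pause
    2: ["l"],          # shuttle forward
    3: ["i"],          # mark in
    4: ["o"],          # mark out
    5: ["ctrl", "z"],  # undo
}

pygame.init()
pygame.joystick.init()
pad = pygame.joystick.Joystick(0)  # first connected controller
pad.init()

while True:
    for event in pygame.event.get():
        if event.type == pygame.JOYBUTTONDOWN and event.button in BUTTON_TO_KEYS:
            keys = BUTTON_TO_KEYS[event.button]
            if len(keys) > 1:
                pyautogui.hotkey(*keys)   # chorded shortcut, e.g. Ctrl+Z
            else:
                pyautogui.press(keys[0])  # single key, e.g. J/K/L
```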

Even if you prefer the traditional keyboard and mouse setup, the game industry has developed many fantastic input devices that video pros can take advantage of. After all, sitting at a computer playing a game and sitting at a computer editing a video both require similar ergonomic considerations. 

But let’s go back to the handheld controllers.

If you think about it, a video game controller is kind of like a puppeteering device. It lets you make on-screen characters run, jump, crouch, shoot, and talk.

So it makes a lot of sense that The Jim Henson Company is interested in transferring their legendary skills at manipulating physical puppets to the digital realm. The company’s puppeteers can now use their award-winning puppet performance control systems to manipulate a digital character.

The result is BETI, a CG character controlled and rendered in real time on the set of the new show Earth to Ned.

Of course, they’re not using typical Xbox or Playstation controllers, but many of the hardware and software lessons learned from the game industry are being incorporated into these fascinating control devices. And the same is true for a variety of new and exciting production tools. 

6. Open worlds

To really understand how video games are transforming the larger entertainment landscape, we need to take a step back from technology.

Think how games have influenced, or even fundamentally shifted, popular culture, content distribution, and profit models for content creators.

Let’s start here: games aren’t just games anymore. They represent communities.

Some of the earliest examples were MMO (massively multiplayer online) games. They were, in essence, the first interactive social networks.

And these trends have continued with Fortnite. It isn’t just a multiplayer battle royale game. It’s an interactive space where friends and like-minded strangers congregate online, socializing over their headset mics in between intense battlefield skirmishes. Young people hate to talk on the phone, but they’ll chat for hours while playing online.

Players even consume other types of media while they’re in-game.

In May 2020, Fortnite players were treated to the world premiere of the trailer for director Christopher Nolan’s Tenet. In June 2020, Fortnite players spent an aggregate of more than 12 million hours at a virtual Travis Scott concert.

In a rather telling letter to shareholders, Netflix CEO Reed Hastings said he considers Fortnite a more formidable competitor than HBO.

But video gamers aren’t just passively consuming media. They’re creating it, too. And the savviest ones are making money. A lot of money.

These creators are hardly working in a vacuum. The most talented among them can share their work way beyond their friend circle, with an array of distribution options—think TikTok, YouTube, Vimeo, and Twitch—all of them with their own strengths, weaknesses, and idiosyncrasies. And none of them with gatekeepers or any other real barriers to entry. 

Further, live streams on YouTube, Twitch and Instagram encourage interactivity, solving a problem that traditional TV has never managed to tackle. Users are now empowered to ask questions, crack jokes, and even throw cash directly at creators in real time by subscribing to their Twitch channels.

How much cash? Fortnite champ Ninja has 24.2 million subscribers on YouTube and 16.7 million followers on Twitch, and at one point claimed to have earned $500,000 a month as a full-time pro gamer. Ninja became so famous so fast that Microsoft paid him somewhere between $30 million and $50 million to leave Twitch and join the company’s ill-fated Mixer platform.

A year later, Mixer collapsed and an even wealthier Ninja returned to Twitch. He appeared on the cover of The Hollywood Reporter, guested on Jimmy Fallon and was featured at Lollapalooza. Call him Twitch’s first crossover success story.

There’s demand for pre-recorded content, too: YouTube viewers watched 100 billion hours of gaming content in 2020, with video related to Microsoft’s megahit Minecraft alone earning 201 billion views.

And this isn’t just talking heads on webcams and a bad green screen. Check out the level of craft and detail in something like Worlds Apart, a 45-minute long movie produced by Black Plasma Studios entirely inside Minecraft. It has attracted more than 24 million views.

If Twitch streaming and Minecraft movies seem like niche interests compared to the shows on HBO Max and Disney+, consider that independent creators on social media are definitely getting paid. Even small YouTubers often earn as much as, or more than, many independent filmmakers.

So if video game creators are eschewing traditional content distribution models, and still making bank, what does that mean for the film and TV industry?

7. Skip the tutorial

It’s clear that video game technology is having a profound effect on film and TV production. New tools allow for more immersive worlds and realistic characters, and unlock tremendous efficiencies in our workflows.

But how will the ideas and skills learned from video games influence the next generation of filmmakers?

We’ve already discussed some of the ways gamers are creating video content, either leveraging the games themselves as storytelling tools or showcasing themselves as gaming personalities. But more than that, games have become an incredible tool for teaching the crafts of filmmaking.

With virtual cameras integrated into the most popular franchises—Fortnite, Minecraft, Roblox, Madden NFL, NBA 2K, and countless more—video games are teaching millions of young people not just visual storytelling but interactive digital production.

On top of that, games are equipping them with the tools to tell their own stories and experiment with the craft.

It’s no wonder gamers have driven demand for software that can make even a modest desktop gaming rig a capable streaming machine. Twitch streaming is already built into consoles like the Xbox and PS5, but free, open-source software like OBS Studio empowers PC users to stream video straight out of a video game viewport—or to capture it for later editing—and even key out a green-screen background when they appear on camera. 
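
That green-screen trick is classic chroma keying, which OBS applies to every frame in real time. Here’s a minimal sketch of the same idea on a single still frame using OpenCV in Python. The file names and the HSV range for “green” are assumptions you’d tune for your own footage.

```python
# Minimal chroma-key sketch: drop a green-screen subject onto a new background.
# File names and the HSV range for "green" are placeholders to tune per shot.
import cv2
import numpy as np

frame = cv2.imread("webcam_frame.png")          # subject shot against green
background = cv2.imread("gameplay_frame.png")   # e.g. a captured game frame
background = cv2.resize(background, (frame.shape[1], frame.shape[0]))

# HSV makes "green" much easier to isolate than raw RGB values.
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
green_mask = cv2.inRange(hsv, np.array([40, 60, 60]), np.array([85, 255, 255]))

# Where the pixel is green, show the background; otherwise keep the subject.
composite = np.where(green_mask[..., None] > 0, background, frame)
cv2.imwrite("composite.png", composite)
```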

While there are plenty of free and low-cost prosumer editing applications, the most ambitious gamers are teaching themselves to use very capable NLEs like Final Cut Pro, Adobe Premiere Pro, and DaVinci Resolve.

So what happens in ten years or so, when today’s tweens and teens enter the media industry having grown up with easy, consistent access to interactive digital narratives and storytelling tools?

If anything has the potential to shake up the notoriously conservative film business, it’s a whole new generation of media-savvy creatives with an intuitive understanding of shot framing, action choreography, and editorial techniques and a decided lack of reverence for established styles and genres.

Matt Workman points out that young people are already pioneering a new, playful style of real time entertainment that hasn’t really crossed over to linear media yet.

Just consider the emerging field of “virtual influencers” or “Vtubers,” social-media celebrities who present themselves to their followers as 3D CG avatars.

One of the most prominent is CodeMiko. She’s a virtual character performed in real time by an L.A.-based animator and coder. The creator invested $20,000 in mocap hardware and software after being laid off from a studio job during COVID lockdown.

With her natural command of real time mocap and animation techniques—and a talent for instigating real time social interactions with other popular streamers—CodeMiko has grown into a full-fledged business with five developers, a management firm, a publicist, and 750,000 Twitch subscribers.

CodeMiko and Ninja are two of the first Twitch creators to cross over to the mainstream, but they certainly won’t be anywhere near the last. 

That’s not the only reason to keep an eye on Twitch. Reporting on recent Twitch trends, Nerd Street Gamers argues that Twitch is quickly becoming a truly international platform, with Spanish-speaking streamers now commanding the largest numbers of concurrent viewers.

As Hollywood continues to struggle with diversity problems, it’s interesting to think about how the democratic fundamentals of streaming could create new opportunities for culturally diverse and marginalized voices.

The next level

In the Goteborg Film Festival’s 2021 Nostradamus Report, media analyst Johanna Koljonen predicts that virtual production will be the norm across the industry by 2026.

The idea of delivering a fully 3D-animated show without a post-production component once amounted to just magical thinking. Today, it seems almost within reach, thanks in part to the flourishing world of video games. Demand for games is only growing, which will require more investment, new tools, and greater creativity.

So filmmakers should pay attention.

One impending development is Epic’s Unreal Engine 5, which is coming later this year. It will bring new features that make real time virtual production workflows better, faster, and cheaper. And this is just one of many exciting tools on the horizon.

Best of all, while the technology may take time and practice to master, there’s no issue with accessibility. Unlike much of the industry’s high-end 3D software, both Unreal Engine and Unity can be downloaded for free by individual users.

If you want to get into CG and virtual production technology, you don’t have to wait. Download the software and jump in.

One thing’s certain—the pace of innovation is not slowing. With so much powerful technology at their fingertips, filmmakers have every incentive to level up their production game.

And that’s exciting for the future of digital storytelling, whether it’s a movie, show, or game.

Featured image from Unreal Engine Production Spotlight © Epic Games

Bryant Frazer

Bryant is a New York-based journalist specializing in filmmaking technology and technique. For many years, he was the editor of StudioDaily, a daily news source for artists, executives, and craftspeople working in production and post-production for film and television. His writing can also be found at Film Freak Central, where he reviews new and classic movies released on Blu-Ray Disc and, since 1994, at Deep-Focus.com, one of the first generation of film sites on the Internet.
