Made in Frame: Foo Fighters Try Death Metal in Horror-Comedy “Studio 666”

A short timeline. A global pandemic. A production shutdown. Three different editors from three continents. It’s the stuff of movie-making nightmares.

Or is it?

In the way that Dave Grohl makes an elaborate arena show look like a big party, our partners at Blackmagic Design created a bespoke remote workflow that made the Foo Fighters’ new comic-horror picture, Studio 666, the kind of project that the crew described as “a dream,” despite the real-world challenges.

In this installment of Made in Frame, we’ll show you how a cutting-edge DaVinci Resolve-Frame.io pipeline provides a glimpse into the future of filmmaking by keeping the demons of time, money, and stress at bay.

Monstrously creative

Dave Grohl’s creative ambitions can sometimes exceed what seems humanly possible to achieve within the confines of the 24-hour day and 365-day calendar.

But his boundless enthusiasm and energy serve as inspiration to everyone in the extended Foo family.

Dave’s 2020-2021 calendar included directing a feature documentary, writing a best-selling memoir, performing stadium tour dates, and collaborating with musical artists from St. Vincent to drumming wunderkind Nandi Bushell—not to mention recording the Bee Gees cover LP, Hail Satin.

He also produced the Foos’ tenth studio album, Medicine at Midnight, recorded at “an unnamed Encino house.” The house apparently had a distinctly creepy vibe, which inspired Dave (along with writers Jeff Buhler and Rebecca Hughes) to create Studio 666.

In a nod to the campy rock-and-roll movies of the 1960s (think The Beatles’ Help!) in which the band members find themselves in some kind of fictional peril, Dave’s three requirements were that it be scary, funny, and disgusting.

Producer Jim Rota recounts that they began by enlisting the help of special effects wizard (and Oxcart Assembly partner) Tony Gardner. As the mastermind responsible for some of the most iconic puppet and makeup creations in cinema history (Chucky, Bad Grandpa), he was uniquely suited for the project.

“We started with the killing,” Jim says. “We asked Tony what some of the ultimate ways were that he’d wanted to kill movie characters that he hadn’t done before. Then we gave that to Dave and the writers, and that became the basis for the script.”

When real life gets scary

Also no stranger to horror, director BJ McDonnell helmed the film, which they shot on location at the very same spooky Encino mansion.

The team planned for a brisk shoot of approximately four weeks to work within the band’s limited availability. “The guys have a devastating schedule,” Jim says.

“They’re recording and touring—rinse, repeat. So if they have time to do something, it all has to be done within a four- or six-month period.”

Principal photography commenced in February 2020, while Dave was in post-production on his documentary. And then one of the scariest events in modern history halted production on March 13, 2020 (yeah, it was a Friday).

“We had about a week left to shoot,” Jim says. “But because we had to shut down for COVID, when we finally came back in August, what should have taken a week ended up taking more like three weeks.”

Which is why it was a good thing they had devised a workflow that kept them working even through the pandemic.

Reducing fear

The production’s workflow was always designed to save time, money, and stress—even before COVID came into play.

Rather than having a more traditional DIT cart on set, the team employed a DAM (digital asset management) cart operated by Mike Smollin, color pipeline and workflow manager at Blackmagic Design.

“It’s really more like a mobile post-production lab,” Mike says. “You’re essentially creating the dailies and delivering directly to editorial rather than to a post facility.”

The team shot at 3.4K on two ARRI Alexa Minis. They also used a Blackmagic URSA Mini Pro 12K to capture VFX plates at 12K.

During breaks in the shoot for setups or meals, Mike took advantage of the time to offload the camera cards, sync the dailies, apply the show LUT—or even do some additional color correction—and upload to Frame.io.

Importantly, Mike’s color correction was applied on top of the files rather than baked in, which let them continue to adjust the color before delivering the final film.

“I rendered H.264 files to a local folder first,” Mike says. “You can choose if you want to render the entire timeline, or only load certain clips to Frame.io for everyone to review, like circle takes.”

“What’s great is that you’ve got the director and the DP still physically there on the set. We’re able to view the morning’s dailies at lunchtime, and then shortly after wrap they could see the rest of the day’s shoot.”

Which means that dailies effectively became hourlies, and everyone who needed to know whether they had coverage could move on confidently.
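The cadence Mike describes (offload, sync, grade, render, upload) lends itself to light automation. Here’s a minimal sketch in Python, assuming Resolve’s documented scripting API and a placeholder upload step; the folder path and the upload helper are illustrative, not the production’s actual script.

```python
# A minimal sketch of the on-set dailies pattern Mike describes: render flagged
# takes out of DaVinci Resolve to a local folder, then hand each file to an
# upload step for Frame.io. The paths and the upload stub are illustrative
# assumptions, not the production's actual tooling.
import os
import time

import DaVinciResolveScript as dvr  # scripting module bundled with DaVinci Resolve

RENDER_DIR = "/Volumes/DAM/dailies_h264"  # local destination for H.264 renders (assumption)

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()

# Queue a render of the current timeline; the H.264 format/codec is assumed
# to be preconfigured on the Deliver page.
project.SetRenderSettings({"TargetDir": RENDER_DIR})
job_id = project.AddRenderJob()
project.StartRendering(job_id)

# Wait for the render to finish before uploading.
while project.IsRenderingInProgress():
    time.sleep(10)

def upload_to_frameio(path: str) -> None:
    """Stub: push a rendered file to the show's Frame.io project.
    In practice this might be the Frame.io web app, Transfer app, or API."""
    print(f"Would upload {path} to Frame.io")

for name in sorted(os.listdir(RENDER_DIR)):
    if name.lower().endswith((".mp4", ".mov")):
        upload_to_frameio(os.path.join(RENDER_DIR, name))
```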

This process also served to bring editorial closer to the set, even though they were in a different location during shooting—at the band’s post facility, called 607. The editor and assistant could view the material in Frame.io. And because they didn’t need to wait a day (or two or three) for dailies to be processed and transcoded, they could begin ingesting the 3.4K footage the moment the drives arrived and start cutting scenes together.

For a production on a compressed schedule, this was vital, especially because there are approximately 300 VFX shots in the film, many of which included practical elements captured on set. Editorial was able to communicate back to the set in nearly real time to flag any problems or request additional coverage.

The advantages were numerous. First, it meant that only one scene needed a reshoot after principal photography wrapped. Second, it answered the questions that directors and DPs have asked themselves for decades as they lie awake at night. Did we get the shot? Are we good? “They’re not worrying about the day that’s behind them,” Mike says. “They have peace of mind and can move on to the next day.”

Third, and certainly not least, when COVID shut down the production and editorial needed to work from home, they were able to easily pivot to a remote workflow without missing a beat.

Cloudy visions

The on-premises setup at 607, where the film’s first editor, Byron Wong, and assistant Crystal Pastis were working, was unique from the start.

“This exact workflow hadn’t been done before,” Crystal says. “A lot of it was using the tools we already had and then developing the workflow as we went. It was a new experience for me.”

Ingesting the 3.4K ARRIRAW files into Resolve with an on-set color correction meant that they never had to worry about relinking.

It also meant that when they showed sequences or cuts to the director, DP, and producers, they were seeing something that was quite close to the vision for what the final film would look like.

“Typically you work in proxies,” Crystal says. “It’s kind of unheard of to work in 4K, so that was really incredible. Even the outputs we did looked amazing because we were seeing the RAW, so it wasn’t like you’re working off of proxies and making exports—making a compression of a compression.”

But another key to the sudden shift to working from home was that they had a cloud-based workflow, storing their assets and media both on an on-prem LumaForge Jellyfish and in the cloud on AWS, so they could remotely access whatever they needed from wherever they were.

“We had the two environments because we needed a backup,” Jim Rota explains.

“We had workstations that the editors could log into and be locally connected to the storage at the office. And then we also simultaneously ran virtual machines with another copy of all the data in the cloud on AWS, so they could log into the virtual machine instead and edit in the cloud. If there were issues with either system, we had it set up so that we could jump back and forth, just to stay on schedule.”
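For the dual-environment setup Jim describes, one simple way to keep a cloud copy current is a scheduled one-way sync from the on-prem volume to object storage. The sketch below shows that idea using the AWS CLI from Python; the volume path and bucket name are assumptions, and the production’s actual replication may have worked differently.

```python
# Minimal sketch (not the production's actual tooling) of maintaining a second
# copy of the editorial media in the cloud: mirror the on-prem Jellyfish volume
# to an S3 bucket that the AWS virtual machines can pull from.
import subprocess

JELLYFISH_VOLUME = "/Volumes/Jellyfish/Studio666"  # on-prem shared storage (assumption)
S3_BUCKET = "s3://studio666-editorial-mirror"      # cloud copy (assumption)

def mirror_to_cloud() -> None:
    """One-way sync of new and changed media to the cloud copy via the AWS CLI."""
    subprocess.run(
        ["aws", "s3", "sync", JELLYFISH_VOLUME, S3_BUCKET, "--only-show-errors"],
        check=True,
    )

if __name__ == "__main__":
    mirror_to_cloud()
```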

In all the configurations and scenarios, Frame.io was key to the review process.


According to Jim, there were somewhere between 12 and 15 people weighing in on the material, including the director and DP, the band (who were also producers on the film), Jim and his producing partner, John Ramsay of Roswell Films, and VFX supervisor Matt DeJohn.

Casting out demons

Part of the fallout from the COVID hiatus was crew scheduling. When Byron moved on to another project, editor Andy Canny took over. He’d been in the U.S. during much of COVID, so when he finished the assembly of Studio 666, he headed home to Australia to see his family after more than a year’s separation.

Enter third editor Chris Dickens. Although he’s won an Oscar for Slumdog Millionaire, that wasn’t why they brought him on to the film. Actually, it was because he’d cut Shaun of the Dead, another creative inspiration for the team.

Chris was accustomed to working in Avid and to the more traditional offline-and-conform workflow. But unlike some editors, who love to play with the latest tech, “I’m not a technical person,” he says. “I’m all for technology, but I only care about how it enables the creative side.”

Coming into a Resolve project for the first time presented some new challenges—especially because he was based in a farmhouse in the southwest of England in “the middle of nowhere” while the rest of the team was in Los Angeles. But it also came with certain advantages.

For one, Chris was able to spend some time doing Resolve tutorials without having a director sitting right next to him.

Crystal, who was the assistant on the movie from the start, functioned as the keeper of the cut and the process as it moved from one editor to another. She’d had the advantage of working with both Resolve and Frame.io on previous projects, and when Chris came on board, she had the project and workflow well in hand.

“All I had here at home was my computer,” Chris says. “Crystal took care of all the technology in LA.”

As Chris became more comfortable editing in Resolve, he made some discoveries. “I found the system extremely flexible,” he says.

As for collaborating through Frame.io, he was surprised to learn how easy it had become to work remotely. “For the last couple of years a lot of people had to deal with different ways of working,” he says. “The last project I worked on was okay, but it was limited. But on this project, it went amazingly well.”

By the time Chris started work on Studio 666 they already had a cut that was in good shape. His task was to do the final polish of the cut, which he describes as “putting back some of what had been removed and letting it breathe a little.”

It’s a highly creative and collaborative phase of post-production, particularly when finessing the comedic timing of what was often unscripted material. The Foos are a notoriously funny bunch, and their fans always appreciate their authentic and offbeat humor.

While Chris generally prefers being in the room with the director, he was grateful for the detailed and specific notes he received in Frame.io. Even from the far reaches of another continent and across numerous time zones, he was able to stay creatively aligned with the production team.


“I think the best way I can describe it is that it was transparent,” he says. “The detail is amazing because it points you in the direction of what people are really thinking.”

The experience was so enlightening for Chris that he now believes this is the workflow of the future. “I’ve been doing more of this on subsequent projects. It’s so flexible that you could be almost anywhere and just log in.”

Ghosts in the machine

Almost every horror movie carries a hefty visual effects load, and Studio 666 is no different.

But when you’ve got Tony Gardner on board to kill your characters, you lean into how much you can do with practical effects.

Matt DeJohn, VFX Supervisor, explains their approach to the approximately 300 shots he supervised.

“For the film’s big kills we took the amazing practicals and removed rigs, added dangerous bits like the tip of a blade, and married various elements—like swapping out a real actor with a bloody dummy—together to create the final shot,” Matt says.

“Alongside these bigger visual effects shots was a wide range of work, including paint outs, monitor comps, rain, smoke simulations, and other effects.” Those were achieved primarily in Blackmagic Fusion, with help from Blender and Autodesk’s Flame.

Then there were the shadow people. “They were a major effect throughout the film, and dialing in a look that worked for the budget was a challenge. Geoff Stephenson from Therapy Studios took on the brunt of that effort and did a great job as we explored a variety of options with BJ,” Matt says.

“The shadow people started out as actors wearing black suits, and by digitally adding ghostly smoke, evil eyes, and glowing mouths full of sharp teeth, we were able to bring those characters to life. For the ghosts in the finale, we took a lesson from, and paid homage to, the ghosts from Raiders of the Lost Ark. We filmed ghost puppets in a water tank to give them a natural flowing motion, and digitally added the swirling smoke trails and light effects to sell the effect.”

On set, they used the Blackmagic URSA Mini Pro 12K to capture VFX plates, particularly for the ghost water tank shoot, which they shot at 48fps.

“Capturing at that high resolution allowed us to reframe those elements so it looked as if the ghosts were passing right by the camera without losing any detail,” he explains.

“Because the Blackmagic RAW codec is so efficient, we were able to mock up the staging of the ghost elements directly in the Resolve edit timeline at full resolution and still get real-time playback—even with 8-12 elements all playing back simultaneously in 12K. The higher frame rate made the ghosts’ movement more ethereal and gave us the flexibility to retime things cleanly in post.”

Another unique aspect of the workflow was that because they were editing with the camera RAW files in Resolve, they were working with the same files throughout every phase of post-production.

“We were all working in ARRI Log C and we also made linear .exr files. Some of it was done in standalone Fusion, and some of it was done on the Fusion page in Resolve,” Mike Smollin says.

“So there was no figuring out what the color spaces were and how it was going to work. You just bring it in, you work on it, you final it, and you publish it to the color page. And that’s it. It’s done.”

Crystal agrees. “Because I was giving the VFX team the .exr files at the highest resolution, they could give them right back to me and if BJ liked the shot, all we had to do was mark it from temp to final.”
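For context, the Log C-to-linear step Mike references is a published, well-defined transform, which is part of why there was “no figuring out what the color spaces were.” Here’s a small worked example implementing ARRI’s Log C (v3) decode with the EI 800 constants from ARRI’s white paper; it’s illustrative only, since Resolve and Fusion handle this transform internally.

```python
# Worked example: ARRI Log C (v3) code value to scene-linear, using the
# published EI 800 parameters from ARRI's Log C white paper. Illustrative only;
# Resolve and Fusion perform this conversion internally.

# EI 800 curve constants
CUT, A, B, C, D, E, F = 0.010591, 5.555556, 0.052272, 0.247190, 0.385537, 5.367655, 0.092809

def logc_to_linear(t: float) -> float:
    """Decode a normalized Log C code value (0..1) to scene-linear."""
    if t > E * CUT + F:
        return (10.0 ** ((t - D) / C) - B) / A
    return (t - F) / E

# 18% grey encodes to roughly 0.391 in Log C at EI 800
print(round(logc_to_linear(0.391), 2))  # ~0.18
```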

Frame.io was especially important to the VFX component of post-production because COVID forced the team to largely work from home.

Matt had used Frame.io on previous projects but says that this show “really took full advantage of its functionality. We used it initially to access dailies so we could identify assets for our look development and additional elements that would be needed for VFX.”

Beyond that, they used Frame.io to “review the VFX up until the final pass with EXRs,” Matt says.

“This allowed me and BJ to jot down notes and mark up frames [using the annotation tools] quickly and easily wherever we happened to be. The ease of access and intuitive interface made it simple and efficient for everyone to use and collaborate.”

Crystal underscores the value. “We’re all working remotely and BJ says, ‘I want more smoke.’ Okay, but where do you want that smoke and where do you want it coming out of? Having that capability of being able to specify it on the exact frame, like, ‘I need smoke coming out of this demon at this size’ saved a ton of time and was hugely important,” she says.

Your secret is safe

Crystal, who became a Frame.io power user on this film, put it to the test across many aspects of the workflow.

“Resolve lets you upload directly to Frame.io, so if the director or one of the producers wanted to see something, like all the takes of Dave shredding on the guitar, for example, it was easy to just string them together and upload them to Frame.io,” she says.

“In a pre-COVID situation, they’d come into the room and we’d scrub through it for them, but because we were all remote, Frame.io gave them the ability to see everything the way we’d normally show it to them from wherever they were.”

But perhaps even more importantly, Crystal leveraged Frame.io’s security features in some significantly time-saving ways. The Foos are known for surprising their fans, so making sure that everything stayed secure while working across continents was a key concern.


“We took security very seriously because this was all done in secret during COVID. We had to make sure that for someone like me who’s handling all these outputs, nothing was accidentally leaked. That’s one of my biggest fears,” Crystal says.

“The capabilities that Frame.io has for security are awesome. There’s a great watermarking feature and, for editors and assistants, it’s wonderful because you can add a watermark after the fact and customize it,” Crystal says.

“So let’s say you have a two-hour cut and it’s exporting in real time and then somebody says, ‘Oh, we actually need another version of this and sorry, I didn’t tell you, but you need to watermark it.’”

“But once it’s in Frame.io you can just copy it and put new individual watermarks on it or remove them. Or I could put specific passwords on specific cuts to make sure that that cut went to only the producers and the director.”
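The pattern Crystal describes (upload one master, duplicate it per recipient, watermark each copy, and gate specific cuts behind passwords) could also be scripted. The sketch below shows the shape of that loop against an imaginary FrameioWrapper helper; the class, its methods, and the recipient names are assumptions for illustration, not Frame.io’s actual SDK.

```python
# Hypothetical sketch of the "one export, many watermarked copies" pattern
# Crystal describes. FrameioWrapper is an imaginary stand-in, not the real
# Frame.io SDK; it only illustrates the shape of the workflow.

RECIPIENTS = ["Producer A", "Producer B", "Director", "DP"]  # illustrative names

class FrameioWrapper:
    """Stand-in for a Frame.io client; these methods are assumptions."""
    def copy_asset(self, asset_id: str) -> str:
        return f"{asset_id}-copy"  # pretend the platform duplicated the uploaded asset
    def set_watermark(self, asset_id: str, text: str) -> None:
        print(f"Watermarking {asset_id} with '{text}'")
    def create_review_link(self, asset_id: str, password: str) -> str:
        return f"https://example.com/review/{asset_id}"  # placeholder link

def distribute_cut(client: FrameioWrapper, master_asset_id: str, password: str) -> list[str]:
    """Duplicate the single uploaded master once per recipient, watermark each
    copy with that person's name, and return a password-protected review link."""
    links = []
    for name in RECIPIENTS:
        copy_id = client.copy_asset(master_asset_id)
        client.set_watermark(copy_id, text=name)
        links.append(client.create_review_link(copy_id, password=password))
    return links

print(distribute_cut(FrameioWrapper(), "cut_v12_master", password="producers-only"))
```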

The time savings had a direct impact on Crystal’s workload.

“There were some days that I had to do four or five outputs with different people’s names watermarked. And for me that was just one two-hour export that I uploaded and could then copy as many times as I needed, with a different person’s name watermarked on each one,” she says.

“So on days like that, it could save me six or eight hours of waiting for everything to output. Especially if you’re working in an office and you can’t leave until they’re all done, it’s huge.”

Crystal also found that it was easy to take old cuts down to eliminate confusion. “It was very easy to archive anything that was no longer relevant,” she says.


“Having Frame.io as a tool was incredible, especially in this new world where we’re half in office and half remote.”

Killing it

There’s no doubt that the outcome of this project benefited everyone involved.

For example, not only did the production not have to spend money creating dailies (or waste time waiting for them), they were also able to do final color grading within Resolve—even while the cut was ongoing.

“As soon as a reel was locked, we could start grading, which we did in our own office,” Jim Rota says.

“Honestly, we could have done post-production out of anyone’s house that had a Sony X300 or X310. We also did all of our deliverables out of Resolve, which meant that our entire workflow was lab-free. Sound was the only thing we did on an actual stage.”

Matt DeJohn was happiest about being able to easily shoot practical effects and work within Resolve to combine them with digital visual effects. “There are clear cases where VFX are the best or only solution,” he says.

“But we found opportunities to shoot practical elements, like the ghosts, for objects that would usually be fully CG. That gave us a real-world starting point to ground those creatures, in addition to being a fun creative homage.”

Mike Smollin agrees that this workflow enabled them to get the most bang for their VFX buck.

“There were some very intricate shots at the end of the film. Sometimes there were as many as eight elements stacked on top of each other. If we’d had to export all the plates and elements, get them to a VFX house, and get it all turned around in a timeframe that was acceptable to the director, I don’t think we could have done it,” he says.

At Frame.io, we believe it’s our mission to empower creatives, no matter their choice of tools. We’re thrilled to see any team engineer a high-resolution, cloud-based, all-in-one solution that allows them to work efficiently and creatively—on an indie budget.

So don’t be afraid of the unknown. There are solutions that’ll help you kill your next project, in all the best ways.


Original photography by Irina Logra.

Lisa McNamara

Lisa McNamara is Frame.io's senior content writer and a frequent contributor to The Frame.io Insider. She has worked in film and video post-production approximately since dinosaurs roamed the planet.