Workflow Breakdown of Every Best Picture and Best Editing Oscar Nominee

We at Frame.io love getting nerdy about workflows, so when we had the opportunity to speak to a few of the teams behind this year’s Oscar nominees, we had to ask about the details. As we spoke to different teams, however, we were struck by several different points of comparison between the films, and that sparked the idea of conducting a broader analysis.

So we reached out to the editorial teams on all 9 Best Picture nominees and all 5 Best Editing nominees, and got the details of their technical workflows. There were 11 films in all, since three films were nominated for both categories. (The Post’s editorial team was unfortunately unavailable, so we relied on second-hand reports for that film.)

  • Baby Driver
  • Call Me By Your Name
  • Darkest Hour
  • Dunkirk
  • Get Out
  • I, Tonya
  • Lady Bird
  • Phantom Thread
  • The Post
  • The Shape of Water
  • Three Billboards Outside Ebbing, Missouri

We could have written an article on each of these films, but rather than diving deep into a particular one, we’re presenting an overview, highlighting some of the most interesting elements.

Budget

 

This is not an article about the funding of Hollywood films, but I think that it’s helpful to keep the relative budgets in mind as we look at things like the length of principal photography and the size of the editorial team.

Here’s a look at the budgets for each of these films, from smallest to largest. It’s quite extraordinary how diverse the budgets are on this list of Oscar nominees, from $3.5 million on Call Me By Your Name all the way up to $100 million on Dunkirk.

 

Production Schedule

 

There’s a general trend up and to the right again, with a few exceptions. Production is always the most expensive phase of a feature film, so it’s not surprising that the lower-budget films generally had tighter shooting schedules. What’s genuinely shocking is that the Get Out team shot a four-Oscar-nominated film in just three weeks!

The Post’s shooting schedule might have been longer if the entire film hadn’t been made in such a short amount of time. As soon as he read the script, Steven Spielberg wanted to make and release The Post quickly, because he felt that it was an important piece of social commentary for our time. The film was finished less than nine months after Spielberg first read the script, a schedule that meant every phase of the film had to be rushed.

 

Editorial Team

 

Keeping the films in the same order again, let’s compare the sizes of the editorial teams. I excluded post supervisors and VFX editors, keeping this list to editors, assistant editors, associate editors, and editorial assistants.

While it’s certainly true that the two most expensive films had the largest teams, the rest of the nominated films needed no more than four people on the editorial team. A large part of Dunkirk‘s huge editorial team was needed for the various conform processes (keep reading), but even without that, it had a significantly larger team than any other film on our list.

The Post also had a significant added challenge—they were cutting two films at once! When Steven Spielberg decided to fast-track The Post, he was already in the midst of postproduction on Ready Player One, which couldn’t simply be put on hold for a year. Michael Kahn and Sarah Broshar, the two editors of The Post, thus had to be able to switch from one film to the other on a daily basis. So, I think we can probably cut them a little extra slack.

The Post © 20th Century Fox

I, Tonya’s team had the difficulty of a small budget but over 200 tricky VFX shots. Although Margot Robbie trained hard, a professional skater had to complete the most difficult moves, requiring various tracking techniques to place Margot back into the shots, in addition to CG crowds and stadiums. I, Tonya’s small independent budget meant that Steve Jacks had to pull double duty as first assistant editor and VFX editor (he is credited with both titles). He would cut in VFX shots in the morning, switch to sound work in the afternoon, and sometimes stay late to prep shots to turn over to VFX.

In contrast, Three Billboards’s workflow was so simple that, even with a two-person editing team through most of production, first assistant editor Nicholas Lipari spent very little time on technical details. On a given day, he’d spend around two hours ingesting and processing footage, and then he was able to spend the rest of the day working with editor Jon Gregory, which is quite an unusual scenario. It’s an unfortunate fact that the duties of the assistant editor are often so different from those of the editor that an assistant can perform the job well for years without gaining the skills necessary to move up to an editing position.

(Do you agree with that? Disagree? Let us know in the comments or email blog(at)frame.io—we’re preparing an article on that topic.)

 

Postproduction Schedule

 

You might think that large productions like Dunkirk and The Post would have the luxury of spending as much time in the edit as they need, but the reality is that the amount of time allocated to postproduction doesn’t correlate very well with budget. I’ve kept the films in order of budget from left to right, and there isn’t a discernible trend at all. Note that I am including only the time spent in postproduction after the end of principal photography, although of course these teams were all hard at work producing assembly edits while the films were being shot.

On studio films, the release date is usually set before they even begin shooting, and the process of jockeying for prime release dates means that a studio film can end up with plenty of time for post, or very little.

The smaller independent films, like Three Billboards and Lady Bird, had a lot more freedom to take as much time as needed in the edit. Even small independent films do sometimes have to hit deadlines, of course, often a particular film festival that they’re targeting. It’s interesting to note, though, that even though the independent films generally reported having much more flexibility in their postproduction schedule, they didn’t necessarily take the longest.

Sam Rockwell and Frances McDormand, 2018 Best Supporting Actor and Best Actress winners for Three Billboards Outside Ebbing, Missouri. © Fox Searchlight Pictures

In the case of this year’s nominees, a number of different factors dictated the amount of time in post: technical complexity of the edit (Baby Driver and Dunkirk), pressing release dates (The Post), complex VFX (The Shape of Water), with several taking whatever time they felt they needed (Three Billboards, Lady Bird, Call Me By Your Name).

 

Hardware

 

Given the huge range of budgets for these films, it’s a testament to the amazing democratization of the tools of postproduction that these films were cut on similar hardware. In fact, many of these films were cut not only in full edit suites, but also on laptops running off portable hard drives.

Tweaks to The Shape of Water in a parking lot, photo courtesy of Doug Wilkinson.

It will come as no surprise that Apple’s hardware dominated the list, with most films editing off of Mac Pro “trash cans” in the main edit suites (the new iMac Pros were not available when these films were being edited). Call Me By Your Name’s team opted for top-of-the-line iMacs, and some will be surprised that Darkest Hour and The Shape of Water both used old Mac Pro towers, though The Shape of Water actually moved to Mac Pro trash cans in the middle of the shoot. Old habits die hard.

 

The Edit: Still Standard HD

 

Given Avid’s dominance of the feature film market, it’s no surprise that each of these films was edited using Avid’s DNx family of codecs, most using the DNx 115 flavor, though four films (Baby Driver, Lady Bird, Call Me By Your Name, and The Shape of Water) used the DNx 36 flavor. If you’re more used to the ProRes family, DNx 36 is comparable to ProRes Proxy, while DNx 115 is comparable to ProRes 422 (you can see a comparison table here).

Comparison of DNx codecs. You can find the full chart here.
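For a rough sense of what these codec choices mean for storage, here’s a quick Python sketch comparing approximate per-hour footprints. The bitrates are ballpark published figures for 1080p at 23.976 fps, not exact file sizes, so treat the output as an order-of-magnitude comparison.

```python
# Rough per-hour storage for common 1080p offline codecs.
# Bitrates are approximate published figures at 1080p/23.976 (in Mbps);
# treat them as ballpark numbers, not exact file sizes.
CODEC_MBPS = {
    "DNxHD 36": 36,
    "DNxHD 115": 115,
    "ProRes Proxy": 45,
    "ProRes 422": 117,
}

def gb_per_hour(mbps: float) -> float:
    """Convert a bitrate in megabits/second to gigabytes/hour."""
    return mbps * 3600 / 8 / 1000  # Mb/s -> MB/s -> GB/hour

for codec, mbps in CODEC_MBPS.items():
    print(f"{codec:>13}: {gb_per_hour(mbps):5.1f} GB/hour")
```

At roughly 16 GB per hour, it’s easy to see why a DNx 36 offline edit fits comfortably on a single portable drive.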

Although it’s becoming easier and easier to edit in 4K (and these productions certainly had the budgets for high-end computers), all 11 teams chose to edit in standard 1920 x 1080 resolution. For readers who are used to working on smaller productions where the entire postproduction process happens at the same facility, this might be surprising. These films were all captured or scanned at 4K or higher, and the budgets were plenty large enough to pay for top-of-the-line hardware, so why not edit 4K? If people are editing feature films in raw 6K, surely it can be done in 4K without too much trouble.

Why Not 4K?

If you consider the workflows of these types of films, though, a 4K edit still doesn’t make much sense. Before we ask why these films didn’t edit in 4K, we first have to ask ourselves, “what are the benefits of editing in 4K?”

The primary benefit for most people who edit in 4K is simply the fact that they can skip the offline editing process and do all of their work directly on the camera-original files. Although it’s very easy to design a smooth offline workflow, and all of the major editing packages support it, there is some nice simplicity in avoiding the need to transcode for an offline workflow. If you are doing your color correction and finishing inside of your editor (which is increasingly possible), you have the added significant advantage of being able to move fluidly between phases of your postproduction process. You can spend more time on temporary color-correction as you are editing, knowing that you can continue that work later rather than having to start over again. You can also make edits to the film during the finishing phase, without the headaches of a reverse conform process.

But these feature films, even the low budget ones, all used a traditional offline workflow that involved a handoff from the editors to a separate finishing team at another facility. So there was no possibility of the kind of all-in-one workflows that are now becoming feasible.

As amazing as the iMac Pros and Z840s are for renderless 4K editing, there are still plenty of hiccups and slow-downs involved with editing a significant film in 4K. Temp VFX and color-correction can quickly choke playback on a system that’s not perfectly tuned, and render times stretch out.

The other significant disadvantage to a 4K edit for these films is the size of the storage required, even if you edit off of proxy files instead of the original camera files. At different points in the process, the teams behind Baby Driver, Three Billboards, and The Shape of Water had a complete copy of the film running off of a single hard drive connected to a portable laptop, which would not have been possible had they edited in 4K.

Baby Driver editor Paul Machliss’s laptop and Lacie hard drive on set. Image courtesy Paul Machliss.

A 2-hour feature film might easily have a 50:1 shooting ratio (capturing 50x as much footage as ends up in the final film), which means 6,000 minutes of recorded footage. Using our bitrate formulas, we can quickly calculate the space required.

ProRes 422 at UHD is 503 Mbps, so we plug our numbers into the formula:

503 Mbps × 6,000 minutes × 0.0075 = 22,635 GB, or about 22.6 TB.

22.6 TB is reasonable to store in a powered RAID, but you’re going to have trouble throwing it in your backpack.
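The back-of-the-envelope formula above is easy to wrap in a couple of lines of Python. Here’s a sketch using the article’s numbers, plus, for comparison, what the same 6,000 minutes occupies at DNxHD 36, the 1080p offline codec several of these teams used (treat both as ballpark figures):

```python
# The article's back-of-the-envelope formula: Mbps * minutes * 0.0075 = GB.
# (0.0075 = 60 seconds/minute ÷ 8 bits/byte ÷ 1000 MB/GB.)

def storage_gb(mbps: float, minutes: float) -> float:
    """Total storage in gigabytes for footage at a given bitrate and duration."""
    return mbps * minutes * 0.0075

# A 2-hour film at a 50:1 shooting ratio: 120 * 50 = 6,000 minutes of footage.
footage_min = 120 * 50

print(f"ProRes 422 at UHD: {storage_gb(503, footage_min) / 1000:.1f} TB")  # 22.6 TB
print(f"DNxHD 36 at 1080p: {storage_gb(36, footage_min) / 1000:.2f} TB")   # 1.62 TB
```

The difference is stark: the offline edit fits on a single bus-powered drive, while the 4K version needs a RAID.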

So, if you can’t take advantage of the primary benefits of a 4K workflow, and it will slow you down and hamper you, it just doesn’t make any sense to use cutting-edge 4K workflows purely for their own sake. Even if you’re cutting Dunkirk.

The one benefit that these productions would have received from a 4K edit is simply the greater resolution and detail during the editing process. That can be an advantage, but just not a big enough advantage to tempt these teams …yet.

Note that I’m specifically talking about editing in 4K vs 1080p, which is a separate question from whether to capture and master at 4K or 1080p. All of these films used a traditional offline workflow, which allowed them to edit in 1080p but produce a final output at a higher resolution.

 

Film vs Digital Acquisition

 

The decision to shoot on film vs. digital is always a fraught one. It can be a question of budget, a question of personal taste, or a question of subject matter. All of the films on this list had enough of a budget to consider film, but it’s interesting to note that, of the six movies shot on film (Baby Driver, I, Tonya, Call Me By Your Name, The Post, Phantom Thread, and Dunkirk), five of them were period pieces of some sort. Films set in the past tend to shoot on film more often than films set in the present because we tend to associate the look of older film stocks with the past.

Of course, it’s always possible to emulate the look of an older film stock with a digital image, but I’m trying not to take sides too much here 🙂

The only non-period-piece to use film was Baby Driver, which is even more surprising given the fact that the film’s synchronization to its soundtrack required real-time, on-set editing (more on that later). Shooting digitally could have simplified this workflow, but Edgar Wright is a film purist, and it’s pretty hard to argue with a director like Edgar Wright.

  • Baby Driver: 35mm, Alexa Mini at 2.8K ARRIRAW.
  • Call Me By Your Name: 35mm.
  • Darkest Hour: Alexa SXT, Alexa Mini at 3.4K ARRIRAW.
  • Dunkirk: IMAX 65mm and Standard 65mm film.
  • Get Out: Alexa Mini at 3.2K ProRes 4444.
  • I, Tonya: 35mm, Alexa 65 at 6.5K ARRIRAW, Phantom.
  • Lady Bird: Alexa Mini at 2K ProRes 4444.
  • Phantom Thread: 35mm film.
  • The Post: 35mm film.
  • The Shape of Water: Alexa Mini at 3.2K ARRIRAW.
  • Three Billboards Outside Ebbing, Missouri: Alexa XT Plus at 2K ARRIRAW, Ursa Mini.

Of these 11 films, Dunkirk was the only film to do an optical print with no digital intermediate. This was again a personal aesthetic choice by Christopher Nolan to avoid the digital color tools and restrict the color treatment to the limited tools of the traditional photochemical color-timing process.

The five other films that shot on physical film all scanned the images and did their color correction and online effects on digital files. They then exported DCPs (digital files) so that the films could be projected digitally, or else printed those digital files back onto film for a film projection (Phantom Thread). Dunkirk, instead, was projected on film using prints made from the original camera negatives where possible.

There are several caveats, however. Dunkirk still did a digital scan of the camera negatives, for several different reasons. First, in spite of his love of physical film, not even Christopher Nolan can deny the obvious advantages of digital editing, so the film had to be scanned and transcoded to a 1080p digital intermediate codec for the editing. Second, although Christopher Nolan tried to do as many effects in-camera as possible, it was still necessary to do some digital VFX on certain scenes. Those scenes were scanned at 8K and delivered to the visual effects house, which worked at 6K. Those 6K files were then printed back to film negative and spliced in with the prints that had come from camera negatives.

Dunkirk. © Warner Bros.

There was also a third complete scan, done to enable digital delivery, since many theaters have moved to digital-only projection. That scan was made after the film had been color-timed photochemically, though, so the only digital color manipulation needed was matching the look of the digital files as closely as possible to the look of the all-analog version.

(I’ve barely scratched the surface of the extraordinarily complicated workflow on Dunkirk. I recommend this article from the ASC and this interview with Steve Hullfish if you’d like to dive deeper.)

 

Dailies

 

I mentioned above that every film used Avid’s DNx offline codecs for editorial, although a couple of these films added a few extra twists to the process.

Most of these films followed a fairly standard dailies process. For those films that shot on 35mm, the negatives were immediately scanned to digital files, and then shown to the production team digitally, either by sending a physical hard drive or via a remote dailies viewing system. For the films that shot digitally, the files were transcoded and then delivered to the production team via hard drive or via the cloud. Processing dailies is never “simple”—color needs to be properly managed, sound synced, and metadata carefully managed—but the process is fairly standardized.

Two of these films had particularly complicated dailies workflows, though for completely different reasons.

For Baby Driver, the challenge came from the fact that the film was timed extremely precisely to the soundtrack. The main character is constantly listening to music on his earbuds throughout the film, and every scene is precisely choreographed to fit the music. Every shot had to be timed in order to match this soundtrack, which was piped to the actors via hidden earpieces.

The cast of Baby Driver wore earpieces in order to move in sync to the music.

That obviously presented a huge synchronization challenge, which they addressed by referring to detailed animatics, matching the live-action footage, shot by shot, to the pre-edited animatics. To be sure they were hitting their beats precisely, it was necessary to edit the film in real time, dropping each take into the animatic timeline to make sure that everything lined up as planned.

Since the film was shot on 35mm film, they weren’t able to use the “real” dailies to do this on-set check. Instead, the film’s editor Paul Machliss received a signal from video village, captured in real time using QTake’s video-assist software. QTake recorded ProRes files that could be imported into Avid on Paul’s MacBook Pro, which Paul would then transcode in the background to DNxHD 36, stored on an external hard drive.

So Paul was building his assembly edits with these temporary dailies, while also receiving the dailies that had been scanned from the captured 35mm negatives. Since his temporary dailies were lower quality, he swapped in the 35mm scans as soon as they arrived. And because it takes a few days for negatives to be mailed from Atlanta to Los Angeles, scanned, processed, and then sent back to Atlanta, Paul’s on-set assistant had to swap out temporary dailies for “real” 35mm-scanned dailies on a daily basis. (That was a lot of dailies.)

Since these temporary dailies were coming from a video tap which recorded independently from the film camera, the clips coming in from the 35mm scans didn’t match the temporary dailies precisely. So the new dailies would have to be carefully linked back into the Avid timelines, replacing the temporary dailies without losing sync. Fortunately, reliable time-of-day timecode saved the day, allowing much of the relinking to happen smoothly.

(edit: we earlier reported that the temporary dailies came in at 25fps. That was incorrect: Paul had to work with proxies from a 25fps video tap on The World’s End, but not on Baby Driver. We regret the error.)

Dunkirk’s dailies workflow was even more complicated than Baby Driver’s workflow, though without the real-time component. All of the complications in Dunkirk‘s workflow were due to the fact that the production was shooting simultaneously in two different non-standard types of physical film, and Christopher Nolan wanted to view his dailies projected by a real film projector using a third non-standard type of film.

In the days before digital capture and digital intermediate, everyone had to view their dailies on film—that element was just a throwback to the old days, nothing particularly new. The tricky part was that Christopher Nolan wanted to shoot part of the film on IMAX 65mm, part of the film on standard 65mm, and to view his dailies at IMAX sizes.

In spite of the similar-sounding names, IMAX 65mm and standard 65mm have different frame sizes and different aspect ratios. The names are misleading because both formats use the same 65mm-wide film stock, but IMAX runs it horizontally (so the frame is roughly 65mm tall), while standard 65mm runs it vertically (so the frame is 65mm wide), resulting in two completely different frame formats being known by the same number.

Comparative sizes of standard 35mm film, standard 65mm film, and IMAX 65mm film
Standard 65mm film is as wide as IMAX 65mm film is tall.

Incidentally, this is exactly the same reason why 35mm cinema frames and 35mm still camera frames are not the same size or shape. The stock is the same width, but still cameras run the film horizontally, whereas cinema cameras run it vertically.

Also, you may have heard that the film was projected at IMAX 70mm, not IMAX 65mm. This is another issue of confusing terminology. The image is exactly the same size in both cases—the extra 5mm is used by the soundtrack during projection. So you always capture IMAX at 65mm and display it at 70mm, but the image is not scaled or cropped in the process. Same thing with standard 65mm film—it’s projected at standard 70mm (not IMAX 70mm).

Yeah, confusing.

Ok, back to Dunkirk. Christopher Nolan wanted to view his dailies on physical film, and since he was shooting two different formats, he received dailies in two different formats. All of the standard 65mm shots were printed to standard 70mm, requiring the production team to carry around a 70mm projector with them. Given the amount of material shot, it was simply too expensive to create IMAX prints of all of the IMAX footage. Only selected shots were reviewed at IMAX 70mm—most were reduced down to fit onto a standard 35mm print.

Because of the time required to process the 35mm reductions, they had to skip syncing sound on all of the IMAX dailies. Fortunately, all of the dialogue scenes were shot on standard 65mm, and so they were able to view dailies with sound on the dialogue scenes.

The dailies on The Shape of Water were much simpler technically, but the team had to run a very tight ship because the editorial team worked at the studio where the film was mostly shot. Director and co-writer Guillermo del Toro would drop in frequently, usually during the lunch break, which meant that the assistants had to work very quickly to ingest the previous day’s footage, allowing editor Sidney Wolinsky time to work with it and have something to show Guillermo by lunchtime.

Distributed Workflows

 

There aren’t many film labs capable of handling all of the complexities of Dunkirk’s workflows (and I’ve skipped over several other tricky bits), so they had to ship all of the film from Europe, where most of it was shot, to LA, where IMAX and FotoKem processed dailies, mailing copies back to Europe for the production team. With all of that turnaround time, dailies took around a week to return to production. (Can you still call them dailies if they take a week? Not sure.)

Call Me By Your Name’s team was also distributed around the world, but where Dunkirk’s workflow was complex, theirs was simple, and where Dunkirk’s workflow was simple, theirs was complex.

Dailies were much simpler for Call Me By Your Name since they were working with standard 35mm film. They did have to send dailies from Crema in northern Italy to Rome for development and scanning, but the workflow remained entirely digital from that point on, greatly facilitating their distributed postproduction workflow.

Call Me By Your Name © Sony Pictures Classics

The editorial team remained in Crema, but the rest of the postproduction was scattered across the world. The color grading was done remotely in Thailand, the sound mixing in France, the VFX and sound design in Rome, and the ADR in LA. Director Luca Guadagnino was even able to direct the LA ADR sessions from a suite in Italy: synced Pro Tools sessions let the director and editor listen in live as the actors delivered their performances in LA, view the video correctly synced, and give direction as though they were in the same room.

Get Out’s team was much less spread out than Call Me By Your Name’s team, but they still made use of remote collaboration tools, as did many of the films on our list. Matthew Poliquin at Ingenuity Studios, which produced Get Out’s (mostly invisible) VFX, used Frame.io’s cloud-based review and feedback tools even for internal communication within the same building.

While it’s hard to argue against the benefits of having the director and editor in the same room for the editorial process on a feature film, cloud platforms like Frame.io make it easier to communicate and provide feedback asynchronously. That’s especially true for someone like Matthew working on VFX, who needs to coordinate feedback for many artists working separately, using Frame.io as a single, central platform to track feedback on various drafts of each shot.

The “Sunken Place” scene in Get Out was one of the most challenging for the VFX team at Ingenuity Studios. Find out why.

 

What Really Matters

 

As fulfilling and exciting as it may be to learn about the workflows behind these amazing Oscar-nominated films, at the end of the day, it’s not about the awards. Christopher Nolan doesn’t watch IMAX-sized dailies and Edgar Wright doesn’t meticulously choreograph car chases to music to win an award. I’m sure everyone we talked to on this list (and their collaborators and counterparts on the films not nominated) would all say the same thing—they do what they do for the love of the craft and the pursuit of excellence. That pursuit is what drives every passionate artist. It’s what drives us to create amazing software (or to coordinate dozens of interviews with hard-to-reach professionals to bring you the best possible blog). And it’s our sincere desire that this information will inspire you to do the same. And if you happen to win an Oscar along the way, that’s just icing on the cake.

14th time’s a charm! Imagine if Deakins were only in it for the awards. He finally wins Best Cinematography!

 

Acknowledgments

 

Thank you very much to the busy people who took the time to answer our questions about these films: Paul Machliss, Steve Jacks, Tommaso Gallone, Francesca Addonizio, Mary Juric, Chema Gomez, Doug Wilkinson, Cam McLauchlin, Trevor Lindborg, Nicholas Lipari, John Lee, Crystal Platas, Nick Ramirez, and Matthew Poliquin.

Written by David Kong

Filmmaker, teacher, coder. Director of Content at Frame.io. You can learn more about my various adventures at www.davidkong.net

Interested in contributing?

This blog relies on people like you to step in and add your voice. Send us an email at blog at frame.io if you have an idea for a post or want to write one yourself.