Mixing Frame Rates in DaVinci Resolve – Part 1: Know Thy Frame Rate

This article walks you through how to work with frame rates in DaVinci Resolve. It's aimed at experienced users, but there's ample information for beginners looking to better understand professional workflows. If you're an editor, assistant editor, filmmaker, or any kind of video pro, you know that frame rates can be a real bear in post. If that bear gets a hold of you early in a project, you could end up like Leo in The Revenant, limping your way through the rest of the workflow.

Don’t get messed up like Leo. Read the article.

So why DaVinci Resolve? Well, Resolve is one of the most flexible tools in post-production right now. You can use it to make dailies, edit, color grade, finish, create VFX, and any combination thereof. It may not always be the ideal tool for every situation or workflow, but understanding how to work with DaVinci Resolve can have a major impact on your entire post pipeline, even if you only use Resolve for one part of it.

Project Setup

So let’s jump right into a project. I’m going to assume some working knowledge of Resolve, so if you’re completely new to Resolve, please check out the beginner’s guides on Blackmagic’s training site.

In DaVinci Resolve, when you start a project, you select a frame rate in your Project Settings. Once you set the frame rate for the project, every timeline in your project will run at that frame rate. You can't change this project setting after importing your media. However, Resolve 16 introduced features that let you add timelines that run at different frame rates than the project.

Open up a new project. Click on the gear icon in the lower right-hand corner of the interface to open the Project Settings.

But how should you decide on your project frame rate?

  1. Choose your project frame rate based on the majority of your footage: Production teams usually choose frame rates before shooting, typically based on the delivery format. These delivery formats often vary by region: most productions in the United States shoot 23.976 frames per second, while productions in Europe and parts of Asia use 25 frames per second.
  2. Choose your project frame rate based on your delivery requirements: When deciding the project frame rate, you need to know where your master will be delivered. Broadcast, film theaters, film festivals, Netflix, and the internet each have their own frame-rate specifications.

On most professional commercial and film shoots, the master frame rate is usually decided during pre-production. Your project frame rate should match that. For documentaries, shorts, personal projects, stock videos, YouTube videos, and student films, frame rates can sometimes be an afterthought (or inconsistent). For projects like these, it’s best to set the project frame rate once you know where the piece will go. That way you can set up your project correctly from the beginning.

Frame rates are difficult beasts. Once a project starts, it's hard to change the project frame rate, especially if there are multiple artists involved. If you do end up having to change your project frame rate mid-project, you'll spend a lot of time redoing work. And few projects can bear such wasted time.

If you're not sure which project frame rate to pick or don't have all the information for a particular project, 23.976 is a good place to start. For most projects in the United States, project frame rates are set to 23.976. For US-based projects, 23.976 offers the most flexibility for delivery, because it's easier to convert 23.976 to other delivery frame rates than to go the other way.

Select Master Settings after opening your Project Settings. Set the Timeline frame rate to 23.976. This is the setting that you won’t be able to change once you import media. Set your Playback frame rate to match.

If you have a Blackmagic video card, change your Video Format to a matching frame rate. This setting is for your video signal output and won't affect playback in your GUI. Make sure your playback card and your monitoring hardware support whichever format you choose; otherwise, you won't see an image on your external display.
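If you set up a lot of projects, you can also handle this step with Resolve's built-in Python scripting API. Here's a minimal sketch, assuming Resolve is running and the DaVinciResolveScript module is importable; the setting keys shown come from the scripting API's project settings and may differ slightly between Resolve versions.

```python
# Minimal sketch: set the project frame rate through Resolve's Python scripting API.
# Assumes Resolve is running and DaVinciResolveScript is importable.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()

# Like the Timeline frame rate field in Master Settings, this must be set
# before any media is imported into the project.
project.SetSetting("timelineFrameRate", "23.976")
project.SetSetting("timelinePlaybackFrameRate", "23.976")

print(project.GetSetting("timelineFrameRate"))
```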

Importing Media

First, for this project example, we will import our media in big chunks to an empty project. Then, we will dig into the metadata of our camera files to get a good idea of what we’ve got. Once we’ve done that, we can organize our files, interpret the footage (if necessary), and prepare it for editing.

Resolve makes it easy to import lots of footage quickly. This method is faster than digging into folders and dragging and dropping, and it can save you a lot of time if your media sits within subfolders or is spread across many folders.

Create a footage bin in your Resolve project. Within the footage bin, create a camera bin for each camera type.

Select a camera bin, and then find the matching camera folder in the Media Storage browser on the Media page.

Right-click on your top-level camera folder and select “Add Folder and Subfolders into Media Pool.” Resolve will search through the folder and all of the subfolders within for any media to import.

The media will be added to the camera bin you selected in your project. Do this for each camera type.
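If you prefer to script the bulk import, the same idea can be sketched with the scripting API. The bin names and folder paths below are purely illustrative, and a scripted folder import may not recurse into subfolders exactly like the GUI's "Add Folder and SubFolders" command, so treat this as a starting point rather than a drop-in replacement.

```python
# Sketch: create per-camera bins and bulk-import each camera's folder.
# The folder paths and bin names here are illustrative assumptions.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
media_pool = project.GetMediaPool()
media_storage = resolve.GetMediaStorage()

footage_bin = media_pool.AddSubFolder(media_pool.GetRootFolder(), "FOOTAGE")

cameras = {
    "A-CAM": "/Volumes/FOOTAGE/A_CAM",
    "B-CAM": "/Volumes/FOOTAGE/B_CAM",
}

for bin_name, folder_path in cameras.items():
    cam_bin = media_pool.AddSubFolder(footage_bin, bin_name)
    media_pool.SetCurrentFolder(cam_bin)               # imports land in the selected bin
    media_storage.AddItemListToMediaPool(folder_path)  # import the folder's media
```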

If your footage doesn't match your project, Resolve will ask if you want to change your project frame rate. This prompt will only appear the first time you import media; after that, your project frame rate will be locked. Since we already set the project to our desired frame rate in this example, we'll say no when the dialog box appears. We don't need the project's frame rate to match the footage; instead, we'll interpret the footage to match the project.

Now that we have all of our footage in our project, it’s really easy to see all the frame rate metadata for each camera in one place. We don’t need to dig through bins or subfolders, and we can quickly start breaking down our footage for editing.

Metadata and Script Notes

The FPS Column

But before we start cutting in earnest, we need to take a look at the frame rate metadata. For this example, I’ve imported files from a variety of camera types with a variety of frame rates. The first place to look for frame rate metadata is the FPS column in the Media page.

In the Media page, open up a camera bin. Make sure the list view is selected in the toolbar at the upper right of the Media Pool, so you can see the metadata for each file.

You'll see columns for different metadata entries, like clip name, start TC, end TC, and so on. Scroll horizontally to the FPS column. If you don't see an FPS column, right-click on the row of column headers and check the box for FPS.
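If you'd rather pull this metadata into a report, a quick scripting pass can dump the FPS value for every clip in the selected bin. This is a small sketch, assuming the clip property name "FPS" matches the list view column of the same name:

```python
# Sketch: print the playback FPS for every clip in the currently selected bin.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
media_pool = project.GetMediaPool()

for clip in media_pool.GetCurrentFolder().GetClipList():
    # Clip property names mirror the list view columns ("File Name", "FPS", ...).
    print(clip.GetClipProperty("File Name"), clip.GetClipProperty("FPS"))
```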

As you can see, there are multiple frame rates across the different clips. “But wait a minute, this will mess up the project,” some of you might be thinking. “How do we handle these mixed frame rates?” I hear your concern, but don’t worry. There is a solution.

But first, we need to dig a little deeper into our files to see what’s really going on with the footage. The first place to check is script notes or camera reports.

Script Notes or Camera Reports

Why are script notes and camera reports important? Because cameras don't always embed all of the necessary metadata in the files. Script notes are great because they are a manual log of the camera settings used on set. All of that information is recorded and saved, regardless of whether the proper metadata was embedded in the files.

If you don’t have detailed script notes, then all you have is the file metadata, which can be missing information or misleading.

Most cheap cameras like drones, camera phones, or camcorders don’t record much metadata (or at least not everything you’ll need for a larger project). As these cameras have become more common during commercial shoots, script notes or camera reports have become even more important.

Frame rates and metadata can be tricky. It isn’t always clear from looking at the FPS metadata column what is going on with the files. The numbers can represent the playback frame rate or the recorded frame rate—or both. There are three different representations possible in the FPS column:

  • The FPS is the playback frame rate, not the recorded frame rate
  • The FPS is the recorded frame rate
  • The FPS is a variable rate, as with camera phone footage or screen recordings

This is why script notes can really help. They make it clear which camera settings were used so you’re not limited to file metadata. If there are discrepancies between the metadata and script notes, you’ll know exactly what’s going on.

The Camera FPS Metadata

There are actually two places to check for frame rate metadata in Resolve.

One is the FPS column, as we've already discussed. The other is the Metadata window on the lower right-hand side of the Media page. The Metadata window displays all the metadata that the files contain, beyond what is listed in the clip list view. The Metadata window is also visible on the Edit page.

Click the metadata icon on the top right of the Media Page to open up the Metadata window. You’ll see the basic settings for the selected clip including the playback frame rate. By default, Resolve will also show you all of the various metadata groups.

The metadata group we want to see is the Camera metadata. Click on the down arrow icon in the upper right-hand corner to see the subcategories for the metadata and select Camera. Resolve will display the specific Camera metadata here. Scroll down to find the Camera FPS field.

Something to look out for: Resolve can add a few extra zeros to the Camera FPS metadata. For example, the Camera FPS might read 30000 when the clip was actually shot at 300fps, not 30,000! This is common with Arri files shot at high speed.

The Camera FPS metadata is the recorded frame rate, but not all cameras record a number here. In fact, most don’t. So what does Camera FPS even mean, and how is this different than the frame rate in the FPS column?

When you shoot at higher frame rates, like 60 or 120fps on an Arri or RED, the camera writes that recording speed into the Camera FPS metadata. The second frame rate describes the playback rate, which is the value shown in the FPS column. Usually, it's something like 23.976, 25, or 29.97.

Depending on a few factors with your camera, you might just see a higher frame rate in the FPS column and no information in the Camera FPS metadata. In other cases, even though high-speed footage was shot, there is no Camera FPS metadata and the FPS column only shows the playback speed, so you won't know what the recorded frame rate was. Again, script notes are your friend here.
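You can also compare both values in one pass with a short script. This is only a sketch: I'm assuming "Camera FPS" is exposed through the clip's metadata (it lives in the Metadata window rather than the list view columns), and whether it's populated at all depends on the camera.

```python
# Sketch: compare the playback FPS (clip property) with the Camera FPS (metadata field).
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
media_pool = resolve.GetProjectManager().GetCurrentProject().GetMediaPool()

for clip in media_pool.GetCurrentFolder().GetClipList():
    fps = clip.GetClipProperty("FPS")            # playback frame rate
    camera_fps = clip.GetMetadata("Camera FPS")  # recorded frame rate, often empty
    print(clip.GetClipProperty("File Name"), fps, camera_fps or "n/a")
```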

Four Frame Rate Relationships

So now that we know where to look for our frame rates, we'll dig a little deeper into what those frame rates mean for your project. There are four frame rate relationships that your files can have to any given project or timeline. Understanding each type will help you decide how to handle your files for any part of the post-production process. If you're an editor, assistant editor, DIT, or finish artist, this knowledge is power.

The graphic below is a good general reference for the four frame rate relationships:

Let’s dig a little deeper into how our metadata helps us understand and describe each of these relationship scenarios.

  1. NATIVE = PROJECT: The FPS column matches our project frame rate and the Camera FPS metadata, or there is no Camera FPS metadata at all. Generally this means that the frame rate of our files matches the frame rate of our project, but it might also mean the files were shot at a higher-than-project speed that simply wasn't written into the camera metadata.
  2. NATIVE BASE = PROJECT + HIGH RECORD SPEED: The FPS column matches our project frame rate and the Camera FPS is a bigger number than the FPS metadata. These files were shot at a higher speed than the project frame rate, and will play back in slow motion.
  3. NATIVE BASE ≠ PROJECT: The FPS column doesn’t match our project frame rate and there is no Camera FPS metadata. These types of files can be the most confusing. There could be four different reasons this might happen to a clip:
    • The footage was shot with a fast recording speed like 60fps or 120fps with the intention of being slowed down during editing
    • The footage was shot at a different frame rate intentionally for a different market or project
    • The footage was shot at a different frame rate intentionally and recorded at a higher speed for slowing down
    • The footage was shot at a different frame rate unintentionally
  4. NATIVE BASE ≠ PROJECT + HIGHER RECORD SPEED: The FPS column doesn't match our project frame rate and the Camera FPS is a higher rate than the FPS metadata. These files were shot at a higher speed, probably for a different project, market, or deliverable, or because of a camera limitation. When you edit these files in a timeline, they will play back at the project frame rate by skipping or duplicating frames by default. Resolve will do its best to match the native frame rate of the clip to the project frame rate, but the results won't be perfect.

Understanding these different situations makes it much easier to decide how to handle mixed frame rates in our project. By dealing with them up front before editing, the work can move much more quickly through the rest of the project, leaving more time for creative editing and fewer late-night cram sessions.
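If you want to triage a whole bin at once, the same logic can be expressed as a small script. This sketch is just the decision logic from the list above, with an assumed project frame rate of 23.976; the FPS and Camera FPS values can be read however you like (for example, with the metadata calls shown earlier).

```python
# Sketch: classify a clip into one of the four frame rate relationships.
def classify(clip_fps, camera_fps, project_fps=23.976):
    native_matches = abs(clip_fps - project_fps) < 0.01
    high_speed = camera_fps is not None and camera_fps > clip_fps
    if native_matches and not high_speed:
        return "1. NATIVE = PROJECT"
    if native_matches and high_speed:
        return "2. NATIVE BASE = PROJECT + HIGH RECORD SPEED"
    if not high_speed:
        return "3. NATIVE BASE != PROJECT"
    return "4. NATIVE BASE != PROJECT + HIGHER RECORD SPEED"

print(classify(23.976, None))   # 1. NATIVE = PROJECT
print(classify(23.976, 120.0))  # 2. NATIVE BASE = PROJECT + HIGH RECORD SPEED
print(classify(29.97, None))    # 3. NATIVE BASE != PROJECT
print(classify(29.97, 120.0))   # 4. NATIVE BASE != PROJECT + HIGHER RECORD SPEED
```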

Editing in Resolve

Now that we understand the different possible frame rate relationships in our project, we need to figure out how to actually deal with them. I’m going to propose two approaches to handling frame rates in Resolve before editing:

  1. Changing native frame rates to match the project frame rate
  2. Not changing native frame rates to match the project frame rate

The first option is ideal if you’re staying in Resolve for your whole project including color grading, finishing, etc.

The benefit of the first option is that you won’t need to worry about mixing frame rates during editing. They will all match. This option also makes applying speed changes much simpler and faster.

But this ease and speed also has a drawback. This method will cause the timecode of the files in the project to no longer match the source files on the drive. Any XML, EDL, source timecode burn-in, etc. won’t match the source files. If you don’t care about timecode, conforming back to the source files again, moving shots between different facilities, or anything like that, then this option can make it really easy to edit and finish in Resolve. But if you do care about those things, it can cause a lot of issues.
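For reference, the first approach can also be batch-applied with a script rather than changing Clip Attributes clip by clip. This is a sketch under the assumption that SetClipProperty("FPS", ...) behaves like the Video Frame Rate field in Clip Attributes; test it on a copy of your project, because, as noted above, it changes how the clips' timecode is interpreted.

```python
# Sketch: conform every clip in the selected bin to the project frame rate.
# Assumes SetClipProperty("FPS", ...) behaves like Clip Attributes > Video Frame Rate;
# afterwards, clip timecode no longer matches the source files on the drive.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
project_fps = str(project.GetSetting("timelineFrameRate"))

media_pool = project.GetMediaPool()
for clip in media_pool.GetCurrentFolder().GetClipList():
    if clip.GetClipProperty("FPS") != project_fps:
        clip.SetClipProperty("FPS", project_fps)
```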

That's why the second option is ideal for projects where you're only handling part of the post process in Resolve, and sending out other parts for color grading, finishing, and/or graphics work.

With this method, the timecode embedded in the source files will match the project files, so you can send shots, sequences, or lists to other artists more easily without timecode mismatches.

The drawback of the second option is that editors need to understand how to handle mixed frame rates during editing. If there are a lot of mixed frame rates, it is really helpful for editors to reference the frame rate metadata of the files to avoid duplicating or skipping frames when editing. Specific speed changes can be applied to accurately match these source files to the timeline.

In the next edition of this series, we’ll take a look at native frame rates inside DaVinci Resolve. Check out Part 2.

Dan Swierenga

Dan Swierenga is a professional colorist and Flame artist with over 10 years of experience in post production coloring and finishing many feature films, shorts, documentaries and commercials in LA and Chicago. He is the co-founder of the post production blog ThePostProcess.com, a site dedicated to teaching post production skills and techniques.
