Mixing Frame Rates in DaVinci Resolve – Part 3: Editing and Interpolation

Welcome to part 3 of our series on editing mixed frame rates in DaVinci Resolve.

A useful method for editing mixed frame rate material is to leave the source files at their native frame rates before you start cutting. In my experience, this type of workflow is more common in the commercial post world, because leaving clips at their native frame rates preserves timecode consistency across the project and source files. And in the commercial world, where many shots and sequences are sent out to other facilities or software, matching timecode is essential.

But editing clips with different native frame rates is more tedious than simply conforming them all with clip attributes. By default, if you edit non-matching frame rate files to a timeline, playback will be choppy. That said, most editors don’t worry about this and just edit mixed frame rates together without realizing it. But this method can turn into a big headache down the road.

Logging Metadata

As we’ve mentioned in previous sections, the first step to sorting out all this mixed frame rate business is to add and organize metadata. This is extra important when it comes time for editing, and can be used to help smooth out playback issues.

Percentage Metadata for Smooth Playback

Logging non-matching frame rate files is less involved than simply changing the native frame rates, but it can still deliver similar results.

For this method, you’ll need to change the speed of the clips once they’re edited to your timeline, which will ensure they play back smoothly. Keep in mind, it can be helpful to add those percentages to the Camera Notes field for quick reference, and to add the recorded frame rate (as we did earlier when changing the native frame rate in the Camera FPS field).

For example, imagine we have clips that are 59.94p natively. If we edit one of these clips into a 23.976fps timeline, Resolve will, by default, throw out frames at an uneven cadence to achieve real-time playback. To avoid this issue, we can simply change the clip speed to 40% (23.976 / 59.94 = 0.4, or 40%). Changing the speed to 40% will make the clip play back without dropping any frames.
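This math works for any combination of clip and timeline rates. Here’s a minimal Python sketch of the calculation (the function name is just for illustration):

    def playback_speed_percent(timeline_fps: float, clip_fps: float) -> float:
        """Speed percentage at which a clip plays back every recorded frame."""
        return round(timeline_fps / clip_fps * 100, 2)

    print(playback_speed_percent(23.976, 59.94))   # 40.0 -> 59.94p clip in a 23.976 timeline
    print(playback_speed_percent(29.97, 59.94))    # 50.0 -> 59.94p clip in a 29.97 timeline
    print(playback_speed_percent(23.976, 119.88))  # 20.0 -> 119.88p clip in a 23.976 timeline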

Go through each non-matching-FPS clip in your camera bins. Divide your timeline (project) frame rate by that clip’s FPS. Now add this percentage to the Camera Notes field. For example, “40% native” for 59.94 clips in a 23.976 timeline. This is the number that you’ll use for any speed changes when editing your clips to the timeline.
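If you have a lot of clips, this logging pass can also be scripted. The rough sketch below uses DaVinci Resolve’s Python scripting API; it assumes the script runs in Resolve’s built-in scripting console (where the resolve object already exists), that your clips sit in the root bin of the Media Pool, and that the “Camera FPS” and “Camera Notes” metadata field names match your version of Resolve:

    # Sketch: write each clip's smooth-playback percentage into its Camera Notes.
    project = resolve.GetProjectManager().GetCurrentProject()
    timeline_fps = float(project.GetSetting("timelineFrameRate"))

    # Only looks at the root bin for brevity; sub-bins would need recursion.
    for clip in project.GetMediaPool().GetRootFolder().GetClipList():
        camera_fps = clip.GetMetadata("Camera FPS")  # assumes this field was logged earlier
        if not camera_fps:
            continue
        speed = round(timeline_fps / float(camera_fps) * 100, 2)
        if speed != 100.0:
            clip.SetMetadata("Camera Notes", f"{speed:g}% native")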

This may seem like tedious or unnecessary work at first, but it can really benefit the whole post process if it’s done right the first time. Having this information close at hand can save editors a lot of time.

Camera FPS metadata

If a clip is high speed, you’ll need to read the previous section on finding and adding the recorded frame rates to the Camera FPS metadata field for your clips. This will allow you to retime clips more easily.

Re-Timing with Non-Matching Frame Rates

In contrast to matching frame rates, where you should use multiples of 100, editing clips with mixed frame rates requires that you use multiples of the calculated percentage for smooth playback. If you are re-timing footage and don’t want choppy playback, you’ll need to stick to multiples of that percentage for any speed changes.

For the 59.94p-to-23.976 example, doubling that 40% slow-down to 80% will, of course, speed the footage up. Now the footage will play back closer to real time without any uneven cadence. Frames will still be thrown out, but they will be thrown out evenly, so there won’t be any jerky or skipped motion with this method. By using multiples of 40%, like 80%, 120%, and so on, we can achieve smooth playback across a range of re-timed speeds.
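Listing those usable speeds is simple arithmetic. Here’s a tiny Python sketch (no Resolve API involved; the function name is just for illustration):

    def smooth_speeds(timeline_fps: float, clip_fps: float, count: int = 5):
        """First few speed percentages that keep frame drops/repeats on an even cadence."""
        base = timeline_fps / clip_fps * 100  # e.g. 40.0 for a 59.94p clip in a 23.976 timeline
        return [round(base * n, 2) for n in range(1, count + 1)]

    print(smooth_speeds(23.976, 59.94))  # [40.0, 80.0, 120.0, 160.0, 200.0]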

It’s important to remember that clips with frame rates that aren’t evenly divisible won’t play back smoothly in real time without frame interpolation. It’s also worth noting that synced audio will have issues with re-timing and frame interpolation if it isn’t matched to the timeline.

Frame Interpolation

The retime process defines how Resolve handles any non-matching frame rate or speed change. Within those settings, you’ll find options for frame blending and optical flow, which are methods of rebuilding missing frames through interpolation.

These settings can be found in the Project Settings under Master Settings. Do remember, the project-wide settings can be overridden for any individual clip in your timeline.

By default, Nearest is the setting used for clips, which adds or throws out frames any time you change the clip speed or have non-matching frame rates. It’s generally a good idea to keep this setting at Nearest during editing, as the other options are more processor intensive.

However, other settings, like Optical Flow, smooth out uneven motion, speed changes, and non-matching frame rates fairly well. This setting is usually applied once an edit is closer to being finished, or to test whether optical flow works well on a particular clip. Editing with it on constantly will drain your system resources and might interrupt smooth playback.

But each of these settings behaves differently, and can be useful in certain situations, so let’s take a look at them individually.

Nearest

  • Frames are either dropped or duplicated
  • This is the default setting when mixing frame rates
  • Least processor intensive

Nearest is the default setting for retiming, and is the least processor-intensive option when re-timing clips. You will see stuttering with fast-moving footage or uneven speed changes, but that’s the price of faster performance.

Frame Blend

  • Duplicate frames are blended together
  • Not commonly used
  • Medium processing requirements

Frame Blend is generally not used that much because it creates blended frames, which are very noticeable, even to untrained viewers. If you’re going for that specific effect, Frame Blend might work. But for most situations, frame blending is probably not what you want.

Optical Flow

  • New frames are generated with motion estimation
  • Commonly applied to smooth out motion
  • Heavy processing burden

Optical flow can be used to smooth out these issues when converting between frame rates or changing speed. It can also make it possible to slow down footage beyond what the originally-captured frame rate allows. Optical flow actually interpolates entirely new frames, or new pixels on existing frames, to fill in movement.

Resolve has various levels of quality for the optical flow process. With lower quality settings, your image might tear or start to ghost, which usually makes it unusable. But each increasing quality setting adds significant benefit (and processing load) to the end result.

Here is an example of optical flow warping.

Before warping…
…and after warping.

The first clip has not been re-timed and does not have Optical Flow interpolation applied, so only natural motion blur is present. The second clip has Optical Flow applied at the standard quality setting and has been re-timed to 130%. Notice how the drummer’s right hand warps and tears from frame to frame.

If you want to improve the result, you can change the various optical flow settings in Resolve 16:

Optical Flow Settings

Footage with a higher shutter speed (less motion blur) usually results in better frame interpolation. However, most of the time you have the footage that you have, which was likely shot with more natural motion blur. Few productions intentionally shoot regular-speed footage with higher shutter speeds, but if you haven’t shot yet and plan on severely re-timing footage with Optical Flow, try a higher shutter speed.
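If you do get to plan the shoot, standard shutter-angle math shows how the effective shutter speed changes with frame rate and shutter angle. A quick sketch (this is general camera math, not a Resolve feature):

    def shutter_denominator(fps: float, shutter_angle: float) -> float:
        """Shutter speed expressed as 1/x seconds for a given frame rate and shutter angle."""
        return fps * 360 / shutter_angle

    print(shutter_denominator(23.976, 180))  # ~47.95 -> about 1/48s, "normal" motion blur
    print(shutter_denominator(23.976, 90))   # ~95.90 -> about 1/96s, crisper frames that interpolate better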

Enhanced Optical Flow

In Resolve 15 and higher, there are more options for Optical Flow.

Click the gear icon, choose Master Settings from the options, and look at the top and bottom options under Frame Interpolation. You’ll see Motion Estimation Mode and Motion Range. Motion Estimation Mode has four options, listed in ascending quality, with Enhanced Better producing the best result. Motion Range is a setting that tells Resolve how much motion is happening within the image, which helps the interpolation produce more believable motion blur.

Of course, the higher quality the setting, the longer the processing time. So, if you use Optical Flow, it’s a good idea to use the render cache for those clips (so you don’t have to re-render every time), or to only enable those settings once you’re finishing or ready to export.

Speed Warp

In Resolve 16, Speed Warp is a new, more advanced option for frame interpolation with Optical Flow. This new option is processed by Resolve’s new DaVinci Neural Engine, which uses machine learning and AI to generate new video frames. It sounds fancy, and its results look pretty great.

Speed Warp looks much better than the Enhanced Optical Flow, but is a lot more processor-intensive.

Here is an example of the same scene from above using Speed Warp instead of the standard settings. You’ll notice there is much less warping and far better motion blur interpolation.

After speed warp.

This new setting is set to be a game-changer for slowing footage down or for complicated retime curves.

Speed Warp can really sell the illusion of ultra-slowed-down footage, way beyond what your footage was probably intended for, or could believably achieve otherwise. Also, if you have a complicated creative speed change using lots of curves and uneven multiples, Speed Warp can help hide any frame skipping or duplicating that might be visible.

Since Speed Warp is so processor-intensive, it might be a good idea to bake the effect in. It seems odd to render out a re-timed clip as a new clip, but if you plan on using this effect a lot in a project, it can save you a lot of time.
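If you want to script that bake-out step, here’s a rough sketch using Resolve’s Python scripting API. It assumes the Speed Warp clip is on the current timeline, that the script runs in Resolve’s built-in console (where the resolve object already exists), that your current render settings and format are otherwise what you want, and that the output folder and clip name below are placeholders:

    # Sketch: render the current timeline so the Speed Warp retime is baked
    # into a new media file you can re-import and cut with normally.
    project = resolve.GetProjectManager().GetCurrentProject()

    project.SetRenderSettings({
        "TargetDir": "/path/to/renders",  # placeholder output folder
        "CustomName": "speedwarp_bake",   # placeholder clip name
    })

    project.AddRenderJob()    # queue a job with the current render settings
    project.StartRendering()  # start rendering all queued jobs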

Of particular note for this guide, Speed Warp might help with mixing frame rates together as well. While the older Optical Flow can sometimes help, it hasn’t been dependable or consistent enough for a lot of professional work. But Speed Warp looks like it will greatly improve the ability to slow footage down and to mix frame rates together, so we’ll see how much use it gets for those types of projects.

Ultimately, though, it’s still important to understand how frame rates work in Resolve so that you can control the final product of your video. New tools like Speed Warp are helpful, but they’re of little use if not implemented appropriately. For example, in a professional setting, if you ignore our previous steps of organizing and matching mixed frame rates and just count on Speed Warp to fix every other clip, this will probably slow down your project to an unacceptable level, especially on 4K+ footage. Just like every tool, we need to understand when to use it and when to use alternatives.

In the next edition of this series, we’ll discuss how to use all of these new processes and settings in a proxy workflow. Check out Part 4.

Dan Swierenga

Dan Swierenga is a professional colorist and Flame artist with over 10 years of experience in post production coloring and finishing many feature films, shorts, documentaries and commercials in LA and Chicago. He is the co-founder of the post production blog ThePostProcess.com, a site dedicated to teaching post production skills and techniques.