Timecode and Frame Rates: Everything You Need to Know

If you grew up in the digital age, timecode might seem like a relic of broadcast TV and celluloid film. But it isn’t.

Timecode is a way of recording exactly what happened when: from notes on set to the cuts in the edit to the dialogue, music, and sound effects.

Think of it as a reference tool. When you browse the web, you wander from page to page. Most of the time, you don’t pay much attention to the URLs because you’re focused on the page itself. But when you want to remember a page, or share a page with someone, or communicate about a page, then you need that URL which records the exact location. Timecode does the same for video frames.

For example

Suppose you’re shooting a music video and you’re on the 15th take of one shot; all the takes look identical but there’s this one, perfect moment. Will you remember that spot after shooting 30 identical takes? Wouldn’t it just be easier to have an exact timestamp of the moment? That way you could immediately find it when you’re in the cutting room.

If you’re using Frame.io to get detailed feedback and your client suggests shortening a cut, their comment appears on your timeline in precisely the right place. You know it’s the right place because of timecode. Timecode runs behind the scenes much of the time, but it’s the glue that holds everything together.

It also, crucially, helps keep the sound and the picture in sync during the shoot.

But why can’t I just sync everything in post?

Our digital tools are so sophisticated that it’s become routine to think that everything can be fixed in post. If you’re an editor, you know how false that premise is. You can’t fix everything in post—it just isn’t possible. And when you can, it’s often very expensive.

Yes, if your clips aren't too long, there are applications you can use in post to sync audio and video, including FCP X's built-in sync feature, Premiere Pro CC, Red Giant's PluralEyes, or even Audacity. But the programs you're using in post aren't looking at anything other than digital references to the material. Where you see video and audio that aren't synced, the computer only sees timecode; it has no idea there's a problem. It's even worse if the audio and video have been matched up in error but are totally unrelated.

Now suppose your unsynced audio and video weren't recorded at the same frame rate, so your timecode doesn't even run at the same rate between the two sources. What if, on top of that, your source material doesn't have slates? You see where this is going.

With timecode, you might know what the obvious problems are right at the beginning. But most of the time many issues only become apparent after you’ve done a fair amount of work. In which case, you end up having to redo what you already did or throw it out and start again. That’s a lot of lost effort, time, and money that you have to make up. Those deadlines aren’t moving, and those renders can’t go any faster.

Not without its problems

A basic search online for “timecode problems” shows that it’s not foolproof. You can run into issues with timecode you thought was fine but isn’t, or with timecode that’s fine but where other problems crop up, caused by transcoding or unknown bugs in the programs you’re using.

For example, QTChange used to use a value of 23.98 to calculate its frame rate and timecode instead of 23.976, and this caused problems for its users. Premiere Pro also had an issue with timecode when round-tripping start timecode with audio files. These programs are written by human beings who make mistakes and can’t anticipate every possible situation.

Remember that timecode is native to your files, so bad timecode is basically like a genetic defect.

You should regard sync in post as an absolute last resort. Instead, plan for the timecode you record on set to be as perfect as possible.

If you’re not on set but involved in production prep, talk to whoever’s handling the cameras. Ask them about what settings they’ll be using, and make requests if you can.

If you weren’t even in the pre-production prep, talk to the crew. Find out who shot the material and get as much information as you can about the camera settings. You may not be able to prevent recording issues, but you’ll be better equipped to fix any problems you encounter.

After reading this article, you should be able to set up your timecode to be as trouble-free as possible. Or at least understand what problems you might run into depending on how the footage has been shot. None of this guarantees that post will be problem-free but it should definitely reduce the number of issues you’ll encounter.

What is Timecode?

Timecode is a way of precisely labeling all the frames in a recording so you can know the exact location of a recorded point.

The primary timecode we’ll concern ourselves with is SMPTE timecode, developed in the 1960s by the Society of Motion Picture and Television Engineers (hence the acronym). SMPTE timecode can be recorded alongside both audio and video signals.

Timecode is used to synchronize and reference all types of audiovisual media—video files, sound files, captions, visual effects, and more.

A normal timecode/SMPTE display reads HOURS:MINUTES:SECONDS:FRAMES

Hours, Minutes, and Seconds are numbered like they are on a regular clock. Seconds can be divided up into frames, which are single images and the smallest increment you can have in timecode.

By giving every frame a unique identifier based on the length of the recording or the time at which it was recorded, you can find any specific frame in a recording if you have its timecode reference. But timecode can only count in whole rather than fractional frames.
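The labeling scheme is simple enough to sketch in code. Here’s a minimal Python example (the function names are mine, and it assumes a whole-number frame rate with non-drop counting) that turns a running frame count into an HOURS:MINUTES:SECONDS:FRAMES label and back:

```python
def frames_to_timecode(frame, fps):
    """Label a zero-based frame count as HH:MM:SS:FF (non-drop, whole fps)."""
    ff = frame % fps
    ss = (frame // fps) % 60
    mm = (frame // (fps * 60)) % 60
    hh = frame // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

def timecode_to_frames(tc, fps):
    """Invert the label back into a frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return (hh * 3600 + mm * 60 + ss) * fps + ff
```

At 24fps, frame 86,400 comes out as 01:00:00:00, exactly one hour in.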

Depending on the project you’re working on, there are two primary ways of running the timecode: Free Run and Record Run.

Free-run

Free Run Timecode is used when you want to know the actual time at which events that are being recorded occurred, so it’s useful on a documentary or on music or sports events that last a few days. In Free Run, the recording device runs timecode continuously whether you are shooting or not. The timecode runs at the frame rate set on the device. It’s common to set the clock to the time of day, but it’s important to remember that because the timecode counts in frames and regular clocks do not, there may be drift between the camera’s clock and the actual time of day.

Free Run is now commonly used when you have multiple cameras and audio recorders so you don’t have to worry about stopping and starting all your devices at precisely the same time.

Record run

In Record Run, the recording device runs timecode only when the camera is recording. It counts only the frames recorded so that the total runtime is a reflection of the total usable footage. Record Run is used less frequently these days but is a good option on shoots with a single camera and audio recorder. Many digital cameras simply output TOD (Time of Day) timecode and do not have a Record Run option.

Once you know if the timecode is going to be in Free Run or in Record Run, the next step is identifying what frame rate to use.

Picking the Right Frame Rate

Different parts of the world use different timecode frame rates. The most common ones are:

  • 24 frame/sec (film, ATSC, 2k, 4k, 6k)
  • 25 frame/sec (PAL (used in Europe, Uruguay, Argentina, Australia), SECAM, DVB, ATSC)
  • 29.97 (30 ÷ 1.001) frame/sec (NTSC American System (US, Canada, Mexico, Colombia, etc.), ATSC, PAL-M (Brazil))
  • 30 frame/sec (ATSC)

But…why?

Why the different frame rates around the world? Because originally television was live, not recorded, so the only way to ensure sync between studio cameras and home TV sets was to lock the signal to the electrical mains: 60Hz (30fps) in the US and 50Hz (25fps) in Europe. Television started out in black and white, so when color came along, SMPTE engineers wanted to keep the color signal backward compatible with black-and-white TVs. To do that, they had to slot the color information in between the existing black-and-white signal and slightly lower the frame rate from 30fps to 30/1.001 = 29.97fps to avoid artifacts, thereby creating the NTSC color standard.

Because the smallest increment in timecode is a whole rather than a fractional frame, timecode running at 29.97fps cannot account for the 0.03 frames that are missing in every second; so a device running timecode at 29.97fps runs a little slower than a normal clock.

It’s important to remember that, even though they are related, timecode and frame rate are not the same thing. Timecode is a way to label frames in a recording and frame rate is the speed at which images have been recorded or are played back.

Since the NTSC standard had a peculiar frame rate, a special kind of timecode had to be invented so that editors would know how much real time had passed with a simple frame-labeling method. This is how Drop Frame Timecode was created.

What is Drop Frame Timecode?

Remember that the NTSC frame rate is 29.97fps instead of 30fps, which means that .03 frames are unaccounted for every second. Since timecode can only count in whole frames, after an hour there should be 30fps x 60sec/min x 60min/hr = 108,000 frames. Because NTSC is 29.97fps, after an hour there will be 29.97fps x 60sec/min x 60min/hr = 107,892 frames.

So there’s a discrepancy of 108 frames in NTSC, which means that after one hour of real time, the timecode on your recording would be behind by 3.6 seconds (108 frames ÷ 30fps = 3.6 sec). The timecode count would read 00:59:56:12.
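You can verify this arithmetic with a few lines. A quick Python sketch, using the rounded 29.97 figure the same way the text does:

```python
LABEL_FPS = 30      # Non-Drop Frame labels count as if the rate were 30fps...
ACTUAL_FPS = 29.97  # ...but NTSC actually delivers 29.97fps

expected_frames = LABEL_FPS * 3600    # frames the labels assume per hour: 108,000
captured_frames = ACTUAL_FPS * 3600   # frames actually recorded: 107,892
shortfall = expected_frames - captured_frames  # 108 frames
drift_seconds = shortfall / LABEL_FPS          # 3.6 seconds behind per real hour

print(f"{shortfall:.0f} frames short = {drift_seconds:.1f} sec of drift per hour")
```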

How does drop-frame work?

Drop Frame Timecode works by dropping two frame numbers from each minute except every tenth minute. Your recording is unaffected because it drops frame numbers, not actual frames! Because it drops those numbers, at one hour in real time, your timecode will increase by exactly one hour.

It looks like this: the counter skips frame numbers :00 and :01 at the start of every minute, except for minutes ending in zero. So the display jumps straight from 00:00:59;29 to 00:01:00;02. (Drop Frame timecode is conventionally written with a semicolon before the frames field.)

Logically, you use Drop Frame (DF) timecode when you shoot material at 29.97fps or 59.94i (59.94 interlaced) because it’s meant for TV broadcast. The general confusion around all these identical-looking frame rates means that sometimes people still refer to this as 30fps or 60i even though that’s technically incorrect. If you look back at the various frame rates and the standards they apply to, the only one left that is 30fps is ATSC, which is compatible with 29.97fps / 59.94i. 30fps and 60i are uncommon but unfortunately, some recording devices do record in those formats, so it’s important to make sure that your frame rate is exactly what you think it is.
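The dropped-numbers rule is mechanical enough to write out in full. Below is a sketch of the standard 29.97fps Drop Frame renumbering in Python (the function name is mine; 1,798 and 17,982 are the frame counts of one drop-frame minute and one ten-minute cycle):

```python
def frames_to_df_timecode(frame_number):
    """29.97fps Drop Frame: skip frame numbers :00 and :01 at each minute,
    except for minutes divisible by 10."""
    frames_per_min = 30 * 60 - 2                  # 1798 frames in a drop-frame minute
    frames_per_10min = frames_per_min * 10 + 2    # 17982: the 10th minute drops nothing
    tens, rem = divmod(frame_number, frames_per_10min)
    skipped = 2 * 9 * tens                        # 18 numbers skipped per full 10-min cycle
    if rem >= 2:
        skipped += 2 * ((rem - 2) // frames_per_min)
    frame_number += skipped                       # renumber into the 30fps label space
    ff = frame_number % 30
    ss = (frame_number // 30) % 60
    mm = (frame_number // 1800) % 60
    hh = frame_number // 108000
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"
```

Feeding it frame 107,892 (one real hour at 29.97fps) returns exactly 01:00:00;00, and you can see the skip at each minute boundary: frame 1,799 labels as 00:00:59;29 and frame 1,800 as 00:01:00;02.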

How does non-drop frame work?

Non-Drop Frame Timecode is straightforward: for every frame of recording, a timecode frame is recorded. The ratio of frame recording to timecode count is 1:1.

Why am I hassling you with calculations? Because the more familiar you are with what different kinds of timecode look like, the more quickly you’ll be able to tell if there’s a problem with your source material.

When you make a decision about what timecode to use, you have to take the circumstances into consideration. For example, if you’re shooting a long event in Free Run because you’re keeping a log of the real time at which different things happened, and you’re shooting at 29.97fps, you’ll have to use Drop Frame timecode so that your timecode clock stays in sync with real clocks.

If you used Non-Drop Frame timecode, you’d be behind by 3.6 seconds every hour, which means that after 24 hours your timecode would be 24hrs x 3.6sec/hr = 86.4 seconds behind—nearly a minute and a half!

Mix and match

You can use Non Drop Frame timecode for most other situations that don’t involve the 29.97fps NTSC standard.

If you’re using multiple recording devices on the same shoot, make sure they’re all set to the same kind of timecode—DF or NDF. Any situation in which you mix them will result in massive post headaches for the simple reason that nothing will line up and you’ll be spending your time trying to sync footage that has no matching timecode. That may not seem like a big deal if you have slates, but it still is; think about the number of takes you’d have to manually sync on a whole show.

Occasionally, when using multiple devices that you can’t sync while recording, you might have to identify the weakest link in the chain and adjust your timecode settings on the other devices to match the weakest one. Your aim is to have timecode that counts in the same increments across your recording devices. Camera timecode that doesn’t match clock time may be annoying, but camera timecode that doesn’t match another camera can be a nightmare.

Drop Frame and Non Drop Frame timecode are relatively straightforward once you know how they came to be and what you use them for. But what about 23.98 vs 24fps; what’s the story there?

23.98fps vs. 24fps

We’re all familiar with the 24fps standard because we’ve all seen movies made on film. The idea that 24 frames go into a second of filmed material is so ingrained that the nearly identical rates below cause real confusion for people getting into post.

Movies were shot on film at 24fps, but video was (and is) broadcast at 29.97fps under the NTSC standard. To fit 24fps film properly into a 29.97fps video signal, you first slow the film down by 0.1% to 23.976fps, then use 2:3 pulldown to spread those frames across the video fields.

So 23.976fps, rounded up to 23.98fps, started out as the format for dealing with 24fps film in an NTSC post environment.

23.98fps now exists as a standalone HD video format. But logically it’s only used in NTSC countries so you wouldn’t find it in a PAL 25fps country.

Just to get an idea of the numbers: with a camera shooting in Free Run at 23.98fps, the drift after one hour of real time will also be 3.6 seconds, so the timecode count will read 00:59:56:10 (3 seconds plus 0.6 sec x 24 frames/sec ≈ 14 frames behind).

Though it would be nice to be able to compensate for the 0.024 frames unaccounted for every second, there is no Drop Frame standard for 23.98fps because there’s no amount of frame numbers that can be dropped from the timecode to make it fit neatly into real time. We got lucky with 29.97, but it just doesn’t work with 23.98.

External sound recorders

When using an external sound recorder, it too must have 23.98fps as an available choice, or there will be drift between the sound and picture.

Many older sound recorders only have real-time timecode clocks or can only do 24fps timecode. If that’s the case, then you should probably shoot at 24fps instead of 23.98fps to help keep the audio and picture timecodes in sync.

If you have to shoot video at 23.98fps but your sound recorder can’t record at 23.98fps, there is a workaround that I’ve included at the end of the article.

Whatever you do, be consistent. Don’t mix footage with different frame rates.

If possible, check what your post options are before you shoot to optimize your post time instead of spending a lot of time and money on conversions.

If you’re not part of the production before post, try to find out everything you can about how the material was shot before you do anything with it.

Let’s look at some of the physical tools available so that if you have any say over what’s happening on set, you can ask the proper questions, make the right kinds of requests, or even operate the recorders yourself. If you don’t, you can still ask if these tools were used as long as you understand what they are.

True Sync: Timecode In/Out and Genlock

Again, timecode is a way of labeling frames in a recording. When handled properly, it can be used to sync devices while shooting but that’s not its primary purpose. The terms sync and timecode are often used interchangeably, but timecode alone is not a reliable way of maintaining sync between devices.

Why not? Because many cameras and audio recorders have quartz crystal or similar kinds of clocks that are neither highly precise nor consistent across devices. This means that two different cameras may count a second slightly differently. It doesn’t take long, sometimes as little as half an hour, for two devices to drift apart enough to have visible sync issues. This quickly becomes a problem considering one frame of drift is enough to notice a lip-sync error.

It’s less significant with a single camera (although still possible between your camera and audio recorder), but can be quite dramatic when using multiple cameras.

The timecode in/out port

Most professional cameras have a timecode in/out port. To connect multiple cameras through these ports, you first have to set them to “Free Run” timecode. The simplest way to roughly sync two cameras is to connect the primary camera’s timecode out port to the secondary camera’s timecode in port. But remember what we just said about drift between different devices? Once both cameras start running, the secondary camera keeps counting on its own internal clock, which is why you can still have drift between the two cameras. This is where Genlock comes in.

Just a note before we start on Genlock: Some of the newer cameras don’t have timecode in/out ports. If they don’t and you need to record sync sound on a separate audio recorder that has a timecode out port, you can record the timecode output from the audio recorder as an audio signal on your camera’s audio tracks. While these won’t line up perfectly with your video recording, they will be easier to match up in post than if you leave the two devices running timecode independently of each other. Remember this in case you know the cameras being used don’t have TC In/Out ports and the camera operator isn’t familiar with this trick.

What is Genlock?

Genlock stands for generator locking; it’s sometimes referred to as Sync Lock. A genlock generator sends out a regular, metronome-like “beat” to all of the cameras. The cameras use that signal instead of their own internal clocks, which prevents drift from unreliable clocks. So Genlock synchronizes frames. Timecode by itself doesn’t synchronize anything: it’s a reference tool used to sync material in post after the recording devices have been synced with Genlock. Hopefully, that difference is now clear.

By synchronizing frames, Genlock keeps multiple devices from drifting apart. So with multiple cameras, audio recorders, etc. for true sync, you have to use both the timecode in/out ports and the Genlock port. All the recording devices have to be fed a common timecode source and a sync (beat/pulse) source that is locked to the timecode. Since the sync and primary timecode are locked, the cameras can’t run their own slightly different timecodes and there can’t be drift.

The downside is that all the recording devices have to be hardwired to the sync and timecode source. With multiple cameras, that can mean a lot of cables and a lot of camera calibration if you’re using different length cables. This isn’t a problem in a studio environment but it can be very tricky when shooting on location (which is a good reason to double-check what was done if you weren’t on set). The best scenario is to connect each camera to a reliable sync device such as an Ambient Recording Lockit Box or a Sound Devices recorder with sync output.

Jam-sync

You might be wondering why I haven’t mentioned jam-sync. Jam-syncing means setting a device’s timecode generator from an external master source: the device reads the incoming timecode once and then keeps counting on its own internal clock, filling in by itself if the external signal drops out. So jam-sync isn’t true sync. Each camera is still relying on its own internal clock, which means jam-synced cameras have to be re-jammed frequently, otherwise their timecode starts to drift apart.

If you weren’t on set for a multiple-camera shoot, it’s important to find out whether TC In/Out ports and Genlock were used, because if they weren’t, you’ll know right away to expect drift between the cameras.

If it’s impractical to hardwire all your recording devices to your sync and timecode source, you can try to feed timecode to your recording devices using a wireless audio transmitter. And if you weren’t on set while wireless timecode was used, here’s what you should know about it before you start working on the material in post.

Wireless Timecode

Timecode such as SMPTE 12M LTC can be passed as an analog audio signal. This makes a specific, recognizable sound when played through speakers. A number of companies make apps and other products that promise fantastic wireless timecode and sync. It’s a developing field that will hopefully become more reliable as wireless networks become more reliable.

In the meantime, it’s good to remain slightly skeptical for the following reasons:

  • As we all know, wi-fi isn’t foolproof and there can be signal loss when audio or video is sent wirelessly.
  • More importantly, if the timecode signal drops for any reason, the camera will revert to its internal clock. This may cause any of the drift problems discussed above.
  • If there are only small signal drops, the problem may not get noticed until it’s too late to do anything.
  • If anything goes wrong, troubleshooting over a wi-fi network without timecode hardware may make any already existing problems worse.

It’s worth noting companies like Ambient, which make increasingly solid and reliable wireless hardware based on temperature-compensated, voltage-controlled crystal oscillators (TCVCXO) that drift by less than one frame per day of shooting.

Again, if you’re on set, you have to weigh your options and decide what’s best for you under the circumstances.

Help yourself

To be able to help yourself before post or in post, it helps to know the following:

  1. The difference between timecode and sync.
  2. What frame rate will be or was used.
  3. How to best match your video frame rate to your audio frame rate. Can you do this on set? Or do you need to convert some material in post?

And you need a reliable timecode source to avoid drift on set. In other words, don’t rely on your devices’ internal clocks, and understand how to connect your timecode in/out ports. This also means using Genlock for true sync.

In post: ask if a common timecode source was used, which one it was, and if Genlock was used as well.

On set: hope for better wireless timecode ASAP, since more and more prosumer cameras don’t have a Genlock port.

In post: ask if wireless timecode was used. Particularly if the material was shot with multiple cameras on location.

Addendum on shooting at 23.98fps

If you have to shoot video at 23.98fps but your sound recorder doesn’t offer the option of recording at 23.98fps, the closest thing you can do is record sound at 29.97fps Non Drop Frame if possible. The timecode readings will still differ slightly, but far less than if you run the recorder at another frame rate or off a real-time internal clock.

Here’s a breakdown of how this works:

At whole frame rates

Shooting at whole frame rates means your timecode tracks a real clock with no drift. So audio recorded at 24fps, or at 29.97fps Drop Frame (which is corrected to real time), matches real time exactly, while video shot at 23.98fps falls behind by 3 seconds and 14 frames over an hour.

  • Video timecode after 1hr @ 23.98fps = 00:59:56:10
  • Audio timecode after 1hr @ 24fps or 29.97fps DF = 01:00:00:00

The difference/drift between your video and your audio timecode will be 3 seconds and 14 frames.

At Non-drop frame rates

If you shoot audio at 29.97fps Non Drop Frame, the drift will be 3.6 seconds as we said above. (0.6 seconds x 30 frames per second = 18 frames)

  • Video timecode after 1hr @ 23.98fps = 00:59:56:10
  • Audio timecode after 1hr @ 29.97fps NDF = 00:59:56:12

The video and audio timecode now stay within a fraction of a frame of each other, because both rates run slow by the same 1.001 factor: the seconds fields match, and only the frames fields differ (10 vs. 12) because they count in different bases.
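To make the comparison concrete, here’s a small Python sketch (the function name is mine) that computes the timecode reading after one hour of real time, given the actual capture rate and the rate the labels count at, assuming non-drop counting and the rounded rates used above:

```python
def tc_after_one_real_hour(actual_fps, label_fps):
    """Timecode reading after 3,600 real seconds of Free Run recording."""
    frames = actual_fps * 3600                   # frames actually captured
    label_seconds = frames / label_fps           # elapsed time the labels claim
    s = int(label_seconds)
    ff = round((label_seconds - s) * label_fps)  # leftover frames field
    hh, rem = divmod(s, 3600)
    return f"{hh:02d}:{rem // 60:02d}:{rem % 60:02d}:{ff:02d}"

print(tc_after_one_real_hour(23.976, 24))  # video at 23.98fps
print(tc_after_one_real_hour(29.97, 30))   # audio at 29.97fps NDF
print(tc_after_one_real_hour(24, 24))      # audio at a true whole rate
```

The first two readings agree down to the second, which is why 29.97fps NDF audio is the best available match for 23.98fps picture.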

It’s not perfect, but it’s as close as you can get without shooting at the exact same rate of 23.98fps.

Hilda Saffari

Hilda Saffari is a media and technology consultant with experience in feature post-production and telecommunications. Her feature credits include the first 3D digital intermediate (Spy Kids 3D), Terminator 3: Rise of the Machines and Master and Commander: The Far Side of the World.