“Don’t worry, I basically spend most of the time looking at the scopes.”
I heard this back in the early 2000s in one of my first color sessions. The colorist and I were talking about the calibration of the screen in front of me, and I think they were hoping to instill some confidence that they “knew what they were doing”: they didn’t even need to look at the image, because they mostly worked off the scopes.
But is this really a good way to work? Let’s talk about what scopes do, why we use them, and whether it’s a good idea to rely on them entirely for your color work. (Spoiler alert: it’s not.)
What do scopes do?
At their most basic level, video scopes analyze your footage and give you data about what’s going on in it. They measure objective elements of your images—like brightness and color saturation—so you can make subjective creative decisions more quickly, and with a greater understanding of how those decisions flow into the image pipeline.
Waveform Monitor
Resolve has several scopes built into the color grading software, and these provide most of the tools you’ll need for color grading your images. The first to understand is the Waveform Monitor. This scope reads your image from left to right and maps the brightness of each slice of the frame. It’s easy to see in this sample:
Across the image there’s a very bright sky, which you can see mapped to the top of the scope. To the right of the image, there’s a figure, and that figure creates a dark trace on the right of the scope, since its pixels are less bright.
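To make that concrete, here’s a minimal sketch in Python (using NumPy and Matplotlib) of the computation a waveform monitor performs. The file name, the normalization, and the plot styling are assumptions for illustration, not anything Resolve exposes:

```python
import numpy as np
import matplotlib.pyplot as plt

img = plt.imread("frame.png")[..., :3]   # hypothetical frame grab on disk
if img.dtype == np.uint8:
    img = img / 255.0                    # normalize to 0.0-1.0

# Rec. 709 luma weights; real scopes also offer RGB and other modes.
luma = img @ np.array([0.2126, 0.7152, 0.0722])

h, w = luma.shape
xs = np.tile(np.arange(w), h)            # the column each pixel came from
ys = luma.ravel()                        # that pixel's brightness

# Every column of the image becomes a vertical stack of points: a bright
# sky lands near the top of the plot, a dark figure near the bottom.
plt.scatter(xs, ys, s=0.1, alpha=0.05, color="limegreen")
plt.ylim(0, 1)
plt.show()
```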
You used to see these measured in IRE units (named for the Institute of Radio Engineers), which were mapped from 0-100. IRE measurements under 0 or over 100 are considered “out of range” but are often still recorded by the camera and usable in the grade.
You can still set Resolve to show those units, but Resolve now defaults to showing a 10-bit video scale from 0-1023, with the “broadcast safe” range being 64-940. (If you want to change it back to IRE, click the ellipsis/kebab menu in the scopes panel and choose Percentage.)
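The arithmetic behind those two readouts is simple. A hedged sketch, assuming the standard mapping where 0 IRE sits at 10-bit code 64 and 100 IRE at code 940:

```python
# Conversion between IRE/percentage units and 10-bit code values,
# assuming the legal range of 64-940.
def ire_to_code(ire: float) -> int:
    return round(64 + (ire / 100.0) * (940 - 64))

def code_to_ire(code: int) -> float:
    return (code - 64) / (940 - 64) * 100.0

print(ire_to_code(0), ire_to_code(50), ire_to_code(100))  # 64 502 940
```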
You can also use the Waveform Monitor in Parade view, which shows three separate traces, one each for red, green, and blue. This is incredibly useful when balancing an image, since it makes it easy to see if an image is pushed too far in one color channel, or to balance out the shadows. For many colorists, Parade has become the default view in their scopes.
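In code terms, Parade is just the waveform computation run once per channel. Continuing the hypothetical `img` array from the waveform sketch above:

```python
import numpy as np
import matplotlib.pyplot as plt

h, w, _ = img.shape
for i, color in enumerate(["red", "green", "blue"]):
    xs = np.tile(np.arange(w), h) + i * (w + 20)   # offset each trace sideways
    plt.scatter(xs, img[..., i].ravel(), s=0.1, alpha=0.05, color=color)
plt.ylim(0, 1)
plt.show()
```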
The Vectorscope
The next tool you’ll depend on is known as a vectorscope. It shows a view of the color wheel with color targets for the main complementary color pairs of Red/Cyan, Green/Magenta, and Blue/Yellow. These points are also typically mapped on your color balls, so it’s easy to get an intuitive feel for how this scope works by moving your color balls and seeing how the image responds in both your video preview and on your scope.
In the image above, the trace shows you where your color palette is landing. You can, if you choose, turn on a “skin tone line,” which shows where most skin tones line up across a range of brightness and saturation levels.
This can help in matching an image and correcting skin tones back to feel “normal,” though it’s not necessary to keep every single skin tone in your project precisely on that line.
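Under the hood, a vectorscope is a scatter plot of each pixel’s chroma. Here’s a rough sketch, reusing the `img` array from earlier and assuming Rec. 709 YCbCr coefficients; a real vectorscope also rotates and scales the trace to line up with its graticule:

```python
import matplotlib.pyplot as plt

r, g, b = img[..., 0], img[..., 1], img[..., 2]
cb = -0.1146 * r - 0.3854 * g + 0.5 * b   # blue-difference chroma
cr = 0.5 * r - 0.4542 * g - 0.0458 * b    # red-difference chroma

# Neutral pixels land at the center; saturated colors push outward
# toward the red/green/blue/cyan/magenta/yellow targets.
plt.scatter(cb.ravel(), cr.ravel(), s=0.1, alpha=0.05, color="limegreen")
plt.xlim(-0.6, 0.6); plt.ylim(-0.6, 0.6)
plt.gca().set_aspect("equal")
plt.show()
```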
The Histogram
Resolve’s Histogram view is in some ways quite similar to the Parade view of the Waveform Monitor, but works horizontally instead of vertically. It can sometimes be useful for analyzing the balance of your image.
It has generally been more popular as an on-set monitoring tool and hasn’t gained a major foothold in post production, but it’s there. (If you’re struggling with a tricky shot-to-shot match, it’s worth trying the histogram to see if it gives you a different perspective on what you’re seeing.)
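For comparison, a histogram throws away position entirely and just counts how many pixels sit at each level, per channel. A quick sketch with the same hypothetical `img` as above:

```python
import matplotlib.pyplot as plt

for i, color in enumerate(["red", "green", "blue"]):
    plt.hist(img[..., i].ravel(), bins=256, histtype="step", color=color)
plt.xlabel("level"); plt.ylabel("pixel count")
plt.show()
```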
CIE Chromaticity
The final scope to get a handle on—and the most complicated to understand—is the CIE Chromaticity viewer. This scope draws a trace on a chip-shaped chart that plots all the colors visible to the human eye, as set out by the CIE in 1931 (you’ll often hear this called the 1931 plot or the XYZ plot).
Within that chip there’s a triangle: a larger one if you are working in Rec. 2020, Rec. 2100, or DCI-P3, and a smaller one if you are working in Rec. 709. The corners of that triangle are known as your “primaries,” which are the positions of your red, green, and blue points within the CIE chip.
This scope is especially powerful for evaluating your project for broadcast and analyzing where your image will clip (that is, where color values fall outside the broadcast limits).
As you push against the limits of the triangle you’re working in, you increase the likelihood you’ll lose information to clipping in the broadcast or streaming pipeline. You can make the creative decision to push out to the boundaries if you like, but it’s best to know that you’re doing it before you choose to do it.
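Here’s a hedged sketch of how such a plot is built, again reusing `img` and assuming the frame is Rec. 709. The sRGB transfer curve is used as a close stand-in for linearization, the matrix is the standard Rec. 709-to-XYZ conversion, and the triangle corners are the Rec. 709 primaries:

```python
import numpy as np
import matplotlib.pyplot as plt

# Linearize (sRGB curve as an approximation), then convert to XYZ.
lin = np.where(img <= 0.04045, img / 12.92, ((img + 0.055) / 1.055) ** 2.4)
M = np.array([[0.4124, 0.3576, 0.1805],
              [0.2126, 0.7152, 0.0722],
              [0.0193, 0.1192, 0.9505]])
xyz = lin @ M.T

# Reduce to x/y chromaticity coordinates.
s = xyz.sum(axis=-1) + 1e-9              # avoid divide-by-zero on pure black
x, y = xyz[..., 0] / s, xyz[..., 1] / s

# The Rec. 709 triangle: red, green, and blue primaries.
plt.plot([0.64, 0.30, 0.15, 0.64], [0.33, 0.60, 0.06, 0.33])
plt.scatter(x.ravel(), y.ravel(), s=0.1, alpha=0.05)
plt.xlabel("x"); plt.ylabel("y")
plt.show()
```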
Comparisons and isolation
It’s important to note that Resolve’s scopes aren’t analyzing the underlying shot; they’re analyzing whatever is going on in your preview screen for output.
This means, for instance, that if you are using the Split Screen tool to compare two shots, the scope viewer will also show a split screen between the two images. This is incredibly useful when going through the matching process for a color grade.
Once you’ve set up a shot with an approved look, you can compare that shot to other shots. If you’re struggling to match a shot to the master, you can often find the solution by looking at the comparison in the scopes.
If a shot has a weird color cast in the shadows, or some other secret gremlin making the match difficult, the scope compare is often a great way to solve that problem.
This can also be incredibly useful when trying to dial in just a specific part of the image. By using any tool to isolate part of the frame—like the Qualifier—and turning on Highlight (Shift-H), you can show only a small part of the frame, and the scopes will update to analyze only that part of the frame.
If you have a key wardrobe or set piece, or a product in a commercial where the client wants to ensure a very precise color, this can be an amazingly powerful tool.
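As a sketch of why this is so powerful, imagine feeding only the isolated pixels into the vectorscope math from earlier. The threshold values here are invented; in a real session the Qualifier does the isolating:

```python
import numpy as np

# Crude stand-in for a qualifier keyed on a red product.
mask = (img[..., 0] > 0.5) & (img[..., 1] < 0.3) & (img[..., 2] < 0.3)

if mask.any():
    print(f"selection covers {mask.mean() * 100:.1f}% of the frame")
    # cb/cr from the vectorscope sketch, restricted to the selection.
    print("mean chroma of selection:", cb[mask].mean(), cr[mask].mean())
```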
Why are scopes so important?
Why is it so vital that we have a technical analysis of our video file? There are two main reasons: one technical, one physiological.
The first is that we’re delivering these video files into a vast system of technologies that are highly regulated. When you send a video file to a broadcaster for distribution and it’s out of regulation, it’s referred to as being “illegal.” So, if your reds are too saturated, it’s called an “illegal red.”
These regulations are designed to make sure that all the TVs across the country can display your image properly, and they’re taken very seriously by broadcasters. They have to evaluate the technical qualities of the video to ensure it meets regulations, so you should as well.
The second main reason we depend on scopes has more to do with human physiology. The human visual system is very good at adapting to a given environment. If you sit in a dark room looking at a screen for a long enough period of time—color sessions often run eight hours or longer—you’ll lose perspective on what something actually looks like.
For instance, let’s say you are grading a project like Dune, which has relatively washed-out black levels, sometimes called lifted blacks. Looking again at the image from when we discussed the waveform, you can see the black level of these dune images is up around 128 on the scale, or near 10 IRE.
Because of the way your visual system adapts, that will gradually start looking normal to your eye. By the time you’ve been grading for four hours, you’ll have lost perspective on what looks correct.
If you’re only grading by eye, this can result in you going heavier and heavier on an effect as the day goes on. You start the day at 10am knowing the vision is “lifted blacks,” set them around 10 IRE, and it looks nicely washed out to your eye. By noon, 10 IRE blacks look “normal,” but you know the vision is “lifted blacks,” so you start grading your blacks up around 20 IRE. By 2pm, you’re setting your shadows at 25 IRE, because to your adapted eye the image no longer looks washed out enough.
So we use scopes, along with other tools like reference images and the highly effective “going for a walk outside the color suite,” to help reset our sense of what is normal.
You can’t take a walk after every shot, but you can keep looking down at the trace on the waveform and saying to yourself, “alright, here we are, back at 10.” Scopes give you something objective to orient yourself against, which is hard to come by in any creative field, especially filmmaking.
Internal vs. external scopes
In the early days of color grading, all scope and image analysis tools were external. You took an SDI signal out of Resolve and plugged it into a physical hardware device that measured and analyzed the video signal.
This is still the gold standard. At the highest-end post production facilities you’ll still see banks of physical scopes being used to analyze an image and ensure its readiness for broadcast. Blackmagic, the company that sells DaVinci Resolve, even sells a set of hardware scopes designed to be used alongside Resolve as an outside analysis tool.
As scopes have shifted from hardware to software, the pace of innovation in the space has picked up. It started with scopes built into software like DaVinci Resolve, which meant less reliance on external devices.
You can even get external software tools, like ScopeBox, that are designed to be installed on a dedicated second computer (often an iMac) to analyze the video signal in real time with a host of tools.
Can you really just use software scopes?
The answer to this is, almost definitely, yes. To be sure, there are strong technical arguments that you should never deliver any project to any broadcast network without evaluating it through proper hardware scopes; the purely correct technical answer is “use hardware scopes.”
However, in reality, countless hours of television and even theatrically finished movies are graded without hardware scopes every single day. The industry’s overwhelming movement toward smaller color and finishing houses with less technical overhead has moved us there, whether it’s the “correct” answer or not.
The largest projects of course still have the full technical infrastructure, and whenever you have the budget or other means to access those tools you should take advantage of them. But software scopes have matured to the point that you can be confident in the technical aspects of your images from within the application you’re using.
QC is what you should be worried about
All of this is leading up to a step called QC (quality control). In order to deliver your project to broadcast networks and most of the major streamers, you need to pass a QC checkpoint. This is performed by an outside QC house; it can’t be done by the same company that did the post work—that would be a company regulating itself, which never ends well.
QC is usually a complicated process, and it’s often time-crunched. There’s a delivery date you need to hit, and you’ve sensibly factored in plenty of time for QC. But then the edit runs long, then the sound mix, then you’re given pickups. And now your QC block is looking a lot tighter. More than once in my career, the company has paid for an overnight QC that involved meeting someone at midnight to deliver a file in order to get a QC report by 8am.
What’s the big deal with QC?
The QC report will have notes. Lots of them. These will feature deep technical detail, much of which will come from reading scopes.
Your goal is always to pass QC the first time and avoid the cost of fixing problems, which can be impossible if you’re really crunched for time. To that end, many post houses now employ automated QC tools to evaluate a project in-house before it goes to the outside QC vendor. The idea is to fix as many problems as possible before the hassle of coordinating with outsiders. And this is where scopes really come into play.
The primary things you want to watch the scopes for, in terms of QC, are illegal video and chroma levels: colors, especially reds and oranges, that are both very bright and very saturated. For example, if you’re worried about a particular red kicking back at QC, you can grab it with a secondary and pull it in a bit to bring it into submission.
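As a rough sketch of what an automated pre-check does (not real QC tooling): after a hot grade in float, any channel value outside 0.0-1.0 will clip when encoded into the legal 64-940 range, so you can count the pixels at risk. The grade multiplier here is invented for illustration:

```python
import numpy as np

graded = img * 1.4 - 0.05                # stand-in for an aggressive grade
clipped = ((graded < 0.0) | (graded > 1.0)).any(axis=-1)
print(f"{clipped.mean() * 100:.2f}% of pixels would clip on output")
```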
This brings us back to the image we pulled from the Barbie trailer earlier. Let’s look at it in more detail. This image is clearly intended to be one of robust, almost overly intense color. If you look at four common scopes, you can see in your Waveform view a good spread of brightness levels, but nothing that would worry QC.
But then we look at the Vectorscope and see the color trace in the red area pushing outside the graticule, which shows you could have a problem.
But it’s the newer CIE chromaticity chart that really puts a giant red flag on the issue, showing that the combination of brightness and saturation is pushing against the borders of what can be shown in the color space chosen for the project.
This is the kind of shot that sometimes goes back and forth between a QC house and a production, and might include something in the QC report like “creative intent is for the color to be this hot.” This is also something that is easier to navigate as we move towards bigger color spaces; this project is set up for Rec. 709 for teaching purposes, but here it is again with a larger color space (Rec. 2020).
With Rec. 2020 you end up with a bigger color space, which you can see in the larger spread of colors. Look at the yellow section to see the greater detail Rec. 2020 allows. However, even in Rec. 2020 you still end up having to clip some of that magenta detail. It’s an intense original image, and might suffer in broadcast.
QC will also flag things like moiré, that dancing sensation you often get from the interaction between the pixel grid and fine patterns in clothing. A slight blur or a touch of grain can often fix that up in the color suite, though the scopes usually won’t be much help in flagging moiré while you work.
When should you look at your scopes?
So, when, precisely, do you look at your scopes? If you’re using a color panel or another interface that lets you work without looking down at the controls, you’re probably looking at your image on your reference monitor somewhere between 80-95 percent of the time, and at your scopes around 5-10 percent of the time.
This time likely follows something like a U-curve, with the most time spent looking at the scopes at the start of your grading process, less time in the middle, and then more again at the end.
A typical session on a commercial might look something like this. As you start your grading, you’ll be looking down at the scopes all the time, starting with the initial balance of your shots. You’ll look back and forth, scope to image, image to scope, getting things dialed. Then you’ll look as you work on matching the shots to each other.
Gradually this fades away as you spend more time doing the overall grade of the project, looping over and over and polishing the grade. You might go for long periods of time without ever looking down at the scopes at all during this phase.
Finally, before you finish, you’ll do a few loops where you are looking at the scopes constantly, watching out for any major surprises, monitoring for anything you think might flag at QC.
Yes, you can put on a “broadcast safe” filter if you want, but those are never perfect, and taking a quick look for any colors, especially reds, that are coming on too strong in the vectorscope will make your QC process a lot easier down the road.
Not safe to drive
So, all those years ago, was the colorist right to say “Don’t worry, I mostly look at the scopes”? Nope, that was insane. You don’t “mostly look at your scopes” any more than you can drive “mostly looking at your speedometer.” You mostly look at the image, and use the scopes to help you understand what you’re seeing and keep your eyes fresh.