Free White Paper: Preserving Your Creative Intent in HDR
Editor’s note: Thanks to our friends at Colorfront for contributing this article and sharing their in-depth research with our community. Download the full white paper to get even more technical details on HDR workflows.
Just as we made the leap from standard definition to high definition television years ago, many of us are now navigating the move from SDR (Standard Dynamic Range) to HDR (High Dynamic Range) workflows.
With HDR, creators are now able to bring a more accurate version of their original vision to screens of many sizes, and audiences are able to enjoy images that are more vivid and lifelike.
But maintaining a consistent creative look across today’s rapidly growing field of new display technologies for television, cinema, and portable devices is a huge challenge. This is especially true when considering the wide disparity in possible brightness, contrast, and viewing environments.
With the wide array of digital cameras and SDR/HDR deliverable options, your goal as a filmmaker is to create an effective workflow that’s consistent through every stage of the process, from acquisition to delivery.
So what kinds of new and innovative technologies are available to help you achieve this more easily? In today’s article, we’re diving into Colorfront’s research on perceptual color processing to explore how these tools can help filmmakers.
Preserving creative intent
There are a lot of highly technical white papers out there about HDR. And for good reason.
HDR is a deeply complex subject, worthy of a whole separate article (or a series of them). For now, here’s a simplified starting point:
As the name implies, High Dynamic Range images have a much wider dynamic range (that is, the difference between dark and light) and a much larger gamut (that is, a broader color palette). This lets creators deliver the highest quality and most accurate representation of an intended look.
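To make "wider dynamic range" concrete, here's a small sketch of the PQ (SMPTE ST 2084) inverse EOTF, the transfer function used by most HDR delivery formats. It maps absolute luminance in nits to a normalized signal, and is defined all the way up to 10,000 nits, versus the roughly 100-nit reference white of SDR. The constants come from the ST 2084 specification; the function itself is standard, though this is a simplified illustration rather than a production implementation.

```python
# PQ (SMPTE ST 2084) inverse EOTF: absolute luminance (nits) -> normalized signal.
# Constants as defined in the ST 2084 specification.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Encode absolute luminance (0..10,000 nits) to a normalized PQ value."""
    y = max(nits, 0.0) / 10000.0
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1.0 + C3 * y_m1)) ** M2

# 100-nit SDR white lands around 0.508; a 1,000-nit HDR highlight around 0.751.
```

Notice how SDR's entire range occupies only about half of the PQ signal: the rest is headroom that HDR displays can actually use.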
Directors and cinematographers take great pains to “set the look” for a film or a show, and perform extensive testing with the DI (digital intermediate) colorist prior to principal photography to ensure that their look can be carried all the way through to the final deliverables.
This creative planning and technical preparation is what we mean by preserving the creative intent. There’s so much to consider, it’s one of the most challenging aspects of creating a properly color-managed workflow that’s easy to adopt—and apply—from beginning to end.
To capture the final desired look, it all starts with the DP shooting source footage, which then goes through various image transforms, and is put up on a reference master display. This becomes the hero image, and is what all deliverables are intended to look like, no matter their viewing environment, without requiring individual adjustments.
In a perfect world, a reference master display should be the superset from which all subsequent deliverables are derived. But that’s where the complexity gets in the way.
Single Master Workflow
Let’s start with an example.
A typical LUT-based workflow for TV can only address a single output type, which has commonly been SDR. A different set of LUTs is required for HDR, and the results often don't match the SDR look exactly.
Ideally, both SDR and HDR looks would match in all aspects, except for the increased dynamic range and wider gamut of the HDR image. However, actually achieving this is difficult. Or it used to be.
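For readers unfamiliar with how a LUT works under the hood, here's a minimal sketch of a 1D LUT applied by linear interpolation. The five-point "look" table below is purely illustrative, not from any real camera or show LUT; it just shows why a LUT baked for one output can't be reused for another: its values encode a specific target range.

```python
def apply_lut_1d(value, lut):
    """Map a normalized [0, 1] value through a 1D LUT with linear interpolation."""
    n = len(lut) - 1
    x = min(max(value, 0.0), 1.0) * n   # position in LUT index space
    i = int(x)
    if i >= n:
        return lut[n]
    frac = x - i
    return lut[i] * (1.0 - frac) + lut[i + 1] * frac

# A toy 5-point "look" LUT (hypothetical values for illustration only)
look_lut = [0.0, 0.18, 0.45, 0.75, 1.0]
```

Since the output range is baked into the table values, retargeting to a brighter display means authoring a new LUT, which is exactly where SDR/HDR mismatches creep in.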
This is the essential problem our Single Master Workflow aims to solve—maintaining the same visual intent from the initial look, through grading, to the final deliverables.
The Colorfront Engine is a state-of-the-art parametric color processing pipeline that maps various input formats, including camera original (scene-referred) and graded (display-referred) images, to a wide range of SDR and HDR output formats at user-definable brightness levels and gamuts.
The backbone of the Single Master Workflow is the perceptual processor framework we’ve been refining for years.
By understanding how the human visual system works, we've built an internal processing color space in which perceived color and tonal relationships are preserved. That allows the system to process source content to fit a given display's brightness and surround, and then correctly maintain the original creative intent in the new environment, with little to no manual adjustment.
In other words, it’s designed to convert between a variety of original camera and delivery formats, no matter the color space, brightness level, or dynamic range, and enables seamless:
- Mapping from various camera-native log formats to multiple concurrent HDR/SDR outputs
- Previewing SDR/HDR on a single HDR display
- Down-converting HDR to SDR
- Up-converting SDR to HDR so that legacy SDR content can be integrated into HDR workflows
- Remapping between various HDR and SDR nit levels
- Remapping between standard and wide color gamuts, including Rec. 709, Rec. 2020, DCI-P3, and ACES
- Converting between various resolutions ranging from SD up to 8K
In this process, the image comes in through an IDT (Input Device Transform) to our common grading space. Then, our ODT (Output Device Transform) processes the hero look to all other required output levels. This approach also tunes the brightness and surround parameters to compensate for dark or bright environments.
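The IDT → working space → ODT flow above can be sketched in a few lines. To be clear, these transfer functions are simplified placeholders—a hypothetical log decode and a basic Reinhard-style tone map—not Colorfront's actual transforms. The point is the architecture: grade once in a common working space, then render the same master to any target peak brightness.

```python
def idt_log_to_linear(code):
    """Toy input transform: decode a hypothetical log signal (0..1) to
    scene-linear, spanning ~13 stops with 18% gray near code 0.69."""
    return (2.0 ** (code * 13.0 - 9.0)) * 0.18

def odt_to_display(linear, peak_nits):
    """Toy output transform: Reinhard-style tone map scaled to the
    target display's peak luminance in nits."""
    return (linear / (1.0 + linear)) * peak_nits

# One hero image in the working space...
scene_linear = idt_log_to_linear(0.69)

# ...rendered to two deliverables from the same source, no regrade needed.
sdr_nits = odt_to_display(scene_linear, 100.0)    # 100-nit SDR target
hdr_nits = odt_to_display(scene_linear, 1000.0)   # 1,000-nit HDR target
```

A real pipeline replaces both placeholder functions with perceptually tuned transforms (and adds surround compensation), but the single-master structure is the same.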
From an R&D standpoint, it’s terrifically complicated. But from a usage standpoint, this is a turnkey way to make HDR workflows more accessible for end users.
So what does this mean in practical terms?
As part of our Colorfront Express and On-Set Dailies, you can now easily connect your camera footage from the set to Frame.io. The footage can either be log HEVC files or Original Camera Negatives (OCN) such as ARRIRAW or Sony RAW files.
Once inside Express Dailies, you can fully interact with files inside Frame.io, easily sync audio, and apply the Colorfront Engine to derive both SDR and HDR deliverables for dailies or editorial.
The 10-bit HEVC dailies files can be loaded into Frame.io in addition to the OCN log files. This means that Frame.io users can view in both SDR and HDR, depending on the viewing device.
And now, with the HDR playback feature in Frame.io for iOS, anyone who needs to view the footage can do so on HDR-compatible mobile devices like iPhone or iPad, and can even AirPlay to an HDR-compatible television.
The future is bright(er)
We created Colorfront to solve one of the biggest problems for filmmakers. Historically, getting dailies meant waiting, all because it took time to ensure the original creative intent was being captured.
In the era of celluloid, it could take days. Until just recently, even file-based workflows were still dependent on physically moving media from place to place. But now you can get your dailies instantly from set, no matter where you are.
If you look back at some of the major breakthroughs in the industry—film to tape, tape to digital, and now cloud-based workflows—what this really means is that our technology allows us to close the gap between production and post-production. We’re enabling filmmakers to see what they’re creating faster and more accurately, and to be certain their creative intent is being maintained from capture to delivery.
Of course, there are many challenges to building a color processing pipeline that's truly ready for mastering large-scale projects. And, as is always the case in the film and television industry, we're continually learning more and optimizing our technology accordingly.
That’s why our work on perceptual color processing, and our partnership with Frame.io C2C are so exciting. We can’t wait for filmmakers to try out these robust, empowering solutions.