
Workflow From Home: Episode 9 – Live Remote Color Grading

We’re already up to Episode 9 of our Workflow From Home series, and one of the things that’s been particularly gratifying for us is getting all your feedback and suggestions.

So this episode was created for all of you who’ve been asking about how you can do real-time collaboration.

We’ll show you how it’s done by performing an actual synchronous color grading session from opposite sides of the continent, using Frame.io’s integration with Resolve along with our latest app, Frame.io Transfer, which we released ahead of schedule as a beta specifically to help enable this kind of workflow.

Synchronous and asynchronous review

When it comes to how creative teams interact with each other, there are essentially two categories. The first is synchronous review.

This is how we’ve been accustomed to working: we’re all in the same building or same room, talking to each other, physically interacting with each other, and collaborating in the same time and space with several key stakeholders.

Directors heavily rely on synchronous review during edit sessions, VFX reviews, sound spotting sessions, or color correction sessions. Synchronous review is especially important in animation, where dozens of animators or lighters are touching the same character and need feedback from the director, who often acts out how they want a character to look or behave.

Essentially, synchronous review is critical for creative teams because it’s the only way to keep a single vision on track when there might be hundreds of independent minds influencing the final product.

The other type of collaboration is asynchronous review.

This is what we’ve talked about in previous episodes, where collaborators can upload assets to the cloud from an NLE, or effects tool, or color corrector, so that stakeholders like a director or cinematographer or producer can comment from a remote location.

Frame.io is an example of an asynchronous review tool: creatives can review material on their own time, make detailed notes, mark up the frame, and share their notes back to the team.

But now we’re in a world of remote review—which means everyone wants remote synchronous review. And, as it turns out, there really isn’t a clear, stand-alone solution.

The best opportunity for an efficient remote synchronous review is to combine the ideas of synchronous review and remote collaboration, so that people can accomplish the collaborative process they’re used to while working in a remote setting.

But haven’t we already done this before? Well, a common example has been deployed by top post-production facilities in what is known as “remote grading sessions.”

Remote grading sessions are extremely useful for a director or cinematographer—especially when they’re shooting episodes of a series in one city while color correction on a previous episode is happening in another.

The technological challenge here is that you need high quality transmission, very low latency, a way to communicate verbally, and two color calibrated environments that are identical in display technology and calibration specifications.

When this works, the two environments are virtually indistinguishable, and even though it’s remote, the efficiency of the session might be greater than 90 percent compared to an in-person session. So let’s put a pin in that…

The reason remote review sessions are successful is because they are set up and maintained by the same post vendor with the same display technology in two different places. In other words, post houses remote to themselves, from themselves. Most of the time this is accomplished with several hardware devices including the same display, same calibration practice, and a specialized hardware transmission appliance.

But there’s our problem: post vendors can’t come to our homes and set up appliances. So we don’t have the necessary infrastructure in place for remote synchronous review today. Or do we?

Outlining the problem

  • We need to build a workflow in which the sender (an NLE or color corrector) can output real time images from their computer.
  • It needs to support a secondary external display (that’s an important one that often gets missed).
  • It needs to display 24, 25, or 30 frames per second.
  • It needs zero latency audio.
  • It needs to be at least HD resolution.
  • There needs to be a way for both sides to hear each other.
  • It needs to be high quality enough to judge clear color images.
  • It may need to support HDR.
  • It needs to be encrypted.
  • And there needs to be a trustworthy device on the receiving end to watch it on.

So how do we do that? And why does that seem so hard?

I have a theory. The problem is that the market for a high quality solution like the one we just outlined is, in actuality, relatively small. But all of a sudden, the need for this particular solution is relatively large.

Software solutions like Zoom or GoToMeeting or even Frame.io scale easily in accordance with demand. But a lot of the problem we just outlined lives in hardware. Monitors, calibration, encoding—none of those can really be downloaded, and therefore they can’t be rapidly scaled across users…or can they?

The first thing to point out is that even though the market is small, there are some remote synchronous collaboration tools out there, but each one has different variables that you need to research. Premium tools such as Sohonet Clearview, TVIPS, or Streambox are great hardware appliance technology options you should explore.

What people are asking about are low cost alternatives that are not only easy to set up by yourself, but also avoid hardware appliances and allow creatives to collaborate with no trade-offs.

The community groups that have reached out to me most often about this are colorists and cinematographers. That’s why I wanted to talk directly to all of you. “How can a colorist set up a remote synchronous live color session with a DP from their homes?” It has to be high quality—we’re judging DI after all—and it can’t leverage a dedicated encoding hardware appliance. In other words, we need a software solution—and we need it yesterday.

Frame.io Transfer

Let’s start by creating a use case in which this would be necessary.

Imagine we have a one-hour network episodic series with 10 episodes that need to go through DI. The series was shot on the Alexa Mini using 3.2K ProResXQ files, captured in LogC. We have a colorist working from home with a Resolve 16 Studio setup, and we have a DP who wants to collaborate on a remote synchronous color session from their home.

Back in Episode 2, when we discussed air-gap editing, we talked about the importance of getting the Original Camera Files (OCF) or source files into Frame.io.

This is now an imperative part of the process for a software-based synchronous review session. It’s likely that the post facility has all the source OCF, so we have them upload all the assets to Frame.io so that everything is centrally available to everyone.

Next, once the editor does a turnover for the DI and mastering, the conformist takes the XML or EDL and loads it into the new Frame.io Transfer app, our super high-speed downloader. We’ve just released it in beta specifically to help enable these sorts of workflows.

The conformist then pulls down the ProRes 3.2K files used in the cut. The good news is that Transfer will only download what’s in the list. The bad news is that ProRes 3.2K is a large, high quality file. At a facility that’s not really a big deal, but at home, you need to make sure you are prepared for this session. My internet bandwidth peaks at 400 megabits per second.

But what does that mean? Well, you can use this formula to determine the best path forward: take your source file data rate in megabits per second (which in the case of ProResXQ 3.2K is 1200) and divide it by 8, which gives us 150 megabytes per second.

So let’s round up to 160. We multiply that by 60 seconds per minute and then multiply that by 60 minutes per hour, and then divide by 1,000 since there are 1,000 megabytes in a gigabyte. The result tells us that ProResXQ 3.2K is about 575 gigabytes per hour.

Then we apply the same formula to our bandwidth. As I mentioned, my network download performance is about 400 megabits per second. I divide by 8 to get megabytes, multiply by seconds and minutes, and divide by 1,000 to get gigabytes, and I learn my network can download about 180 gigabytes per hour.

That means a sustained 400 megabit network can download one hour of ProResXQ 3.2K in about three hours. Since a pull list for a one-hour episodic conform is going to include at least two hours of total media, this is doable as an overnight task.
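If you want to sanity-check these numbers for your own files and network, here’s a minimal sketch of the same formula in Python (note that the figures above round the 150 MB/s data rate up to 160, which is why the article lands on about 575 GB rather than the exact 540):

```python
# A minimal sketch of the storage/bandwidth math above.
# Decimal units: 8 megabits per megabyte, 1,000 megabytes per gigabyte.

def gigabytes_per_hour(megabits_per_second: float) -> float:
    """Convert a data rate in Mb/s into gigabytes per hour."""
    return megabits_per_second / 8 * 3600 / 1000

SOURCE_RATE = 1200   # ProResXQ 3.2K data rate, in Mb/s
NETWORK_RATE = 400   # sustained download speed, in Mb/s

footage = gigabytes_per_hour(SOURCE_RATE)   # ~540 GB per hour of footage
pipe = gigabytes_per_hour(NETWORK_RATE)     # ~180 GB of downloads per hour

print(f"Footage: {footage:.0f} GB/hour")
print(f"Network: {pipe:.0f} GB/hour")
print(f"Download time per hour of media: {footage / pipe:.1f} hours")  # ~3.0
```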

Obviously, slower networks will require more time and storage—but you might have neither. The solution? There’s a powerful trick within Transfer that simplifies this workflow.

Let’s say the cinematographer doesn’t have access to a high speed download network, or the ability to store terabytes of data or to play back large, high quality files.

Well, just prior to initiating the download in Transfer from the Frame.io cloud, you can select “Proxy” instead of the default “Original.”

When Frame.io uploaded the ProResXQ 3.2K LogC files, it automatically transcoded an AVC version of the original file for web optimization. This file, of course, is not as good as the original, but it matches the filename, timecode, resolution, aspect ratio, log encoding, and color space.

Frame.io simply makes it available to you as a valuable asset. And a high quality source file like a ProResXQ from an Alexa will compress beautifully into a log AVC file.

What that means is that Frame.io created a 10 megabit, log-encoded AVC, still in 3.2K, and instead of being 575 gigabytes per hour, this file is more like 5 gigabytes per hour.
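Running the proxy’s data rate through the same formula shows where that figure comes from:

```python
# 10 Mb/s AVC proxy, using the same decimal-unit formula as above
proxy_gb_per_hour = 10 / 8 * 3600 / 1000
print(proxy_gb_per_hour)  # 4.5 GB/hour, roughly the 5 GB/hour quoted above
```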

So with this Frame.io tool, a DP can automatically download the entire episode of selects via Transfer, in high quality, in less than an hour—even on a below average network.

In fact, the colorist could even download the AVC files first, and start the grading process using proxies while the source OCF files download in the background.

Resolve Remote Color

In either case, we now have a media foundation for a powerful workflow for colorists who use Resolve that is already built into the tool—Resolve Remote Color.

This is a function that allows multiple people to sit in on a color session from different places, with realtime results. The beauty is that there’s no specialty hardware or video streaming involved. This is really one of the only setups of this kind, and it’s very elegant.

The colorist using Resolve Studio acts as the “host” and the cinematographer, who also has Resolve Studio, acts as the “client.”

There is a little setup required, but it’s relatively simple and we’ve made a setup guide that you can download to help step you through the process.

First, the editor uploads turnover lists to a Frame.io directory so there’s a central place to manage them.

Once that’s received by the conformist or colorist, the EDL is loaded into Transfer, where all the selected source files are downloaded (again, in our case study, this is Alexa ProResXQ 3.2K).

At the same time, the cinematographer logs into Frame.io and downloads the same list into the Transfer tool, and initiates downloading the proxy files. This takes less than an hour and can play back without issue on a laptop hard drive.

After conforming in Resolve, the colorist uploads a DaVinci Resolve Project file (known as a DRP) to Frame.io, which is also downloaded by the cinematographer.

The cinematographer can simply drag and drop the downloaded video assets into Resolve, open the DRP, and now we have two Resolves with identical media, and an identical conform. The trick here is that the colorist has full quality log files on a RAID, and the client has proxy log files on an internal hard drive.

Now we go to the color menu and select REMOTE GRADING.

In order to make this work we have to securely connect two computers together. There are two ways to do this. One is using a virtual private network or VPN, and the other is using what is known as Port Forwarding.

For groups that have strong IT support and stronger security protocols, we recommend getting help to set up a VPN so the colorist and the client can share their computers. But if you’re doing this yourself, it’s simpler to try Port Forwarding.

With Port Forwarding, the client’s Resolve talks to the colorist’s network, and the router directs that traffic to the colorist’s computer. The colorist sets up the forwarding rule in their router (note that every router is different, so you may have to refer to your router’s documentation for how to do this).
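Before typing the address into Resolve, it can save a round of troubleshooting to confirm the forwarded port is actually reachable from the client side. Here’s a minimal sketch in Python; the host address is a placeholder, and port 15000 is an assumption based on the port Resolve has used for Remote Grading, so confirm the actual port against your Resolve documentation.

```python
# Quick reachability check for the colorist's forwarded port.
# HOST is a placeholder (documentation address); PORT 15000 is an
# assumption; verify the Remote Grading port in your Resolve docs.
import socket

HOST = "203.0.113.17"
PORT = 15000

try:
    with socket.create_connection((HOST, PORT), timeout=5):
        print("Port is open; Resolve should be able to connect.")
except OSError as err:
    print(f"Could not reach {HOST}:{PORT}; check the forwarding rule ({err}).")
```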

If you’re wondering about security, since Resolve is listening on one port, only one setup at a time can be accessed through this kind of network. An attacker would have to sniff your IP address, identify which port the service is on, and then expose your computer through the Resolve service on that port.

Resolve allows you to accept or reject access when the port is opened, so the chances of this happening are minimized.

Other safeguards such as a firewall could easily be implemented to make sure the traffic is only going where it’s supposed to. This will still use your public IP address, but it makes sure that other traffic that tries to talk with your computer never has the chance.

The proof of concept

In this episode, I worked with colorist Nick Lareau, who has been working from home in Vermont. After Nick sent me his Port Forwarded IP address, I typed it into Resolve’s Remote Color prompt, selected Connect, and Nick granted me access.

We now have a high fidelity, audio-enabled, software-based grading session going simultaneously. Remember, because we have media on both ends of the pipe, there is no screen sharing, so the data being transmitted through the network isn’t video—it’s play, stop, and grading commands—which are ultra low data exchanges, similar to how online gaming works. This can be done on a Mac or on Windows, and, most importantly, I can even use an external display and watch the colorist work in real time.

Nick and I communicate using an online chat system and can take advantage of the benefits of synchronous review, much like working in a facility. Nick is driving the session and viewing it on a calibrated Sony OLED display.

As a client, I’m using an Apple XDR monitor which I’ve set up in a dark room. I can see every adjustment Nick makes, including the actual color controls, keying, use of secondaries, and custom LUTs.

And most notably, Nick is working with the source files on a direct attached RAID, and I’m able to see the same results with proxies running off my laptop.

If the director or cinematographer doesn’t have a Sony BVM or Apple XDR, many of us now have 4K HDR televisions that use OLED or LED or Quantum Dot technology. There are lots of resources out there on how to calibrate each of these displays.

I personally own a 55-inch LG C9. What’s more, if I need to do remote grading for an HDR or Dolby Vision session, Nick and I can set up our Resolve session as a Dolby Vision or HDR10+ session; I feed an HDMI cable out of my laptop into my LG, and now I’m looking at a calibrated, 700-nit HDR output in real time with a colorist 3,000 miles away.

Because you can upload just about any file to Frame.io (if it’s an RGB file like ProRes or MXF, Frame.io will give you an automatic transcode to share), and because you can load just about anything into a Resolve timeline, you can use this same technique for VFX reviews, edit reviews, final screenings, or even QC.

In all the articles I’ve read, all the interviews I’ve done, and all the correspondences I’ve had since starting this series, there are two things I’ve discovered that are really interesting and keep coming back to me.

First, the individuals and companies that have a history of being willing to think outside the box are the ones that have deployed efficient work-from-home solutions in virtually no time. Some groups we’ve talked to migrated their entire company and their entire pipelines to work from home in as little as one day.

And second, just because we’re all deploying remote workflows doesn’t necessarily mean we have to make qualitative compromises. In other words, if you find yourself in a situation where the work from home isn’t flowing, question it. A good portion of the future of all commerce, let alone media and entertainment, is going to remain virtual.

If the history of cinema technology and toolset transitions has taught us anything, it’s that we like to keep things the way they are. Some of you think setting up remote work is going to be like a camping trip, when in actuality, this is going to be an entirely new foundation. And why go back to normal when normal can be improved upon?

I hope you’ll try these techniques, build on them, provide feedback to the manufacturers, share your ideas, and most of all, come at change with an open mind. This is the perfect time to clear the deck, work together, and form some new habits.

Michael Cioni

Michael is the Senior Director of Global Innovation, Adobe.
