Every February, the Hollywood Professional Association (HPA) holds their annual Tech Retreat, a gathering of thought leaders in creativity, technology, and business, who come together to discuss and debate the topics at the forefront of the media and entertainment industry.
Members of the Motion Picture Academy, Television Academy, American Society of Cinematographers, program leads from NASA, studio technology teams, and notable tech mavens are among those who help shape the future of motion pictures, often by introducing game-changing technology that marks a significant turning point or industry “first.”
This year’s focus? Cloud-based workflows.
The challenge? To make a short film using as many cloud technologies as possible while chronicling a new, hybrid cloud workflow from start to finish.
It’s why Frame.io’s very own Global SVP of Innovation, Michael Cioni, a longtime member of the HPA, leaped at the chance to demonstrate our camera-to-cloud proof of concept at the 2020 retreat, held last month in Palm Springs.
Michael, along with Frame.io CEO Emery Wells, VP of Information Security Abhinav Srivastava, and Workflow Architect Patrick Southern, demonstrated a brand new proof of concept of a hybrid camera-to-cloud workflow—while more than 500 industry experts, influencers, and decision-makers watched.
The project
Spearheaded by HPA President Seth Hallen and led by Joachim (JZ) Zell, VP of Technology at EFILM, the crew of the short film entitled The Lost Lederhosen included Steven Shaw, DGA, ASC (director); Roy Wagner, ASC (cinematographer); Peter Moss, ASC, ACS (camera operator); Sam Nicholson, ASC (VFX camera); Barry Goch (editor); and Stephen Morris (Skywalker Sound).
Cameras were provided by ARRI (Alexa LF), RED (Monstro), Panavision (DXL2), Blackmagic Design (Cinema Camera), and SONY (Venice).
Michael and JZ have known each other for years, and in November of 2019, Michael had given JZ a sneak peek into how Frame.io could be used to centralize a cloud-based workflow.
So for its debut at HPA, Frame.io, along with technology partner Teradek, took on the task of realizing JZ’s vision to create the experimental film by relying on as much cloud technology as possible, with Frame.io at the center of the entire process.
Michael told JZ, “There are numerous tools that work effectively in the cloud and offer virtual services, but Frame.io is the only cloud-based tool that acts as an operating system that touches most of the key professional video aspects that the HPA will need in order for this to be a success.”
The crew spent three days shooting at locations around Los Angeles prior to the conference. During what the HPA calls the “Supersession” on February 18, 2020, they spent the final shoot day on a set constructed in the conference space, where they completed the film in nearly real time and screened it before the live audience.
Frame.io was used in the field to host and distribute all active media (dailies, stills, VFX elements, and material for review) as well as hosting passive media in the form of original camera files (OCF).
“The HPA contacted all the winners of the HPA engineering award as partners for this project,” JZ said. “With each vendor discussion, it became clear that Frame.io was the ‘glue’ product and the missing link for many products and workflows.”
But before we dive into the details of how they did it, it’s worth stepping back to consider how workflows have previously been constructed—and how this project represents the future of video content creation.
The bones of the workflow
If you think about a workflow from an anatomical standpoint, the many different technologies (hardware and software) it takes to produce moving pictures can be considered the “skeleton” of the process.
Connecting the bones are the corresponding creatives, which you might think of as the joints. Cameras and DPs, audio recorders and sound engineers, NLEs and editors, compositing tools and VFX artists, color correction tools and colorists (and more)—all form a production support structure. They make up the most important part of the workflow foundation, and require the most significant investment from an equipment and labor perspective.
While there are still hotbeds of film production where studios are centered and talent pools are deep (Los Angeles, London, Mumbai, Hong Kong), we live in a world where productions are becoming increasingly decentralized.
Creators want the freedom to work in the locations that best serve their stories, with people who share their creative visions and sensibilities.
So what if your bones are in one place and your joints in another? How do you bring them together in a way that lets your workflow function at peak performance, where everything is smoothly connected and optimally aligned?
Friction versus fluidity
Content creators talk about “friction” or “pain points” when it comes to workflows—where the process becomes intrusive and impacts creativity.
It’s similar to what happens when a human body suffers from arthritis. Misalignment or wear and tear cause cartilage and fluids between the joints to break down, creating unwanted friction that leads to pain.
Translated into workflow terms, the most common causes of friction in post-production are distance, time, and communication. And when those break down, pain ensues.
Fluidity is the core concept that inspires the design of Frame.io.
We strive to make the experience of using Frame.io easy and intuitive, even though it’s a deceptively powerful tool. We never want users to feel as though they’re being pulled out of their creative flow; rather, one of our guiding principles is to help creatives feel more immersed in their projects.
Connective tissue
Five years ago, Frame.io’s cloud-based platform began relieving pain points by introducing innovations like real-time commenting and collaboration, frame-accurate feedback, and the ability to share work-in-progress from any place, at any time.
Our mission is to keep finding ways to reduce the pain that occurs when steps in the production process break down. We know, from firsthand experience, where the friction points are most likely to occur.
That’s why we’re constantly looking ahead to see how we can learn to leverage new technologies—like the cloud and 5G networking—to create a fluid workflow that feels like a natural extension of the way industry professionals like to work.
The motion picture industry has worked in fundamentally similar ways since its beginnings. The director and DP collaborate on the set to capture images. Whether via film processing or digital downloads, the original footage has to find its way to the cutting room, where post-production begins and the director and editor collaborate to shape the final film.
Even when digital capture overtook film as the primary medium, the tools changed, but there was still a one- or two-day turnaround to get camera originals into the cutting room. Original camera files need to be downloaded to drives. Drives must be sent to the post-production vendor to be backed up, transcoded, synced, and to have proxies made.
Those files are then sent to the cutting room, where the assistant editor ingests the material to the NLE before cutting can begin. If VFX are part of the process, the editor (or VFX editor) will need to relink to the original files based on the offline. And, finally, when the cut is locked, they’ll need to do a final color grading and assembly by relinking to the original files.
Until now, few of these steps directly connected to one another through a single platform.
In recent years, there’s been a mounting buzz around on-set editing, with editors using material from the video taps as they wait for the original camera files to be downloaded or proxies created by mobile dailies systems.
Especially on large-scale, VFX-intensive projects, directors have become more reliant on real-time technology to ensure that they’ve gotten their shot before moving to the next setup.
The unfortunate truth about the shift to digital technology is that it didn’t speed up the process as significantly as originally promised.
And given the fair amount of manual processing it takes to move through the workflow, there’s still room for errors and failures—even (and perhaps especially) at the highest levels of the industry, where creatives are constantly pushing technology to its limits.
It’s what drives us to keep pushing forward to create the cloud-based paradigm that will power the industry’s workflows.
Building muscle
It’s also why Michael Cioni joined Frame.io. His background in major motion picture workflows makes him one of the key people in our industry who understands what it means to substantively change the way motion pictures are made.
But his credo is that for new technology to be embraced by our industry, it has to feel like it’s enhancing, not impeding, the creative process.
“I’ve been toying with the idea of a camera-to-cloud solution since 2011 [he co-founded Light Iron in 2009], but the technology for it to be integrated into professional workflows was never quite ready,” Michael says.
“The last 18 months of technological progress have opened new doors, and the idea of a robust camera-to-cloud workflow for professionals is coming into view.”
Which brings us to the HPA Tech Retreat.
“Did we have a product that was ready to go to market on that day? No. But did we want to stress test our proof-of-concept? Absolutely. And how better to do it than with some of the industry’s most respected professionals?”
The HPA retreat is “the” place to take risks, demonstrate prototypical projects in a safe environment, and gather valuable feedback from some of the smartest minds in the world.
“We’ve been working on this at Frame.io, and we wanted to deploy an early pre-prototype at HPA for the top thought leaders in the industry to put it through its paces. We wanted them to show us where the real-world vulnerabilities are, so we can figure out how best to move forward in our development,” Michael says.
Resistance is what builds muscle. It’s one of the reasons why we welcome skeptics. “I want to invite people to understand that we don’t have all the answers. We’re far from perfect. And we’re okay with that. I want them to know that we expect there to be resistance,” Michael says. “There always is just before a revolution.”
Disruptive technology without disrupting workflow
For the three-day shoot that preceded the Supersession, director Steven Shaw, DP Roy Wagner, camera operator Peter Moss, and VFX camera operator Sam Nicholson began by capturing footage on the different cameras at 4K, 6K, and 8K.
With every camera record trigger, the Frame.io cloud proxy engine was enabled and images flooded into the cloud for an instant review of all assets, where they were shown in both HDR and SDR for simultaneous comparison using Panavision’s on-set viewing system, LINK HDR.
LTE hotspots were used to establish connections to the cloud, so timecoded and file-named images were instantly backed up and redistributed right on the set.
“It’s an emotional moment for the crew,” Michael says, “when a director calls ‘Cut’ and a few moments later the take is automatically available on an iPad or computer or phone.”
At the end of each day, with the shoot fresh in their minds, Michael and Patrick pulled down all the footage that had been shot that day from the cloud and brought it into FCP X, where Frame.io is available as an embedded extension.
Michael and Patrick were able to edit the day’s work and send review links from Frame.io to all the principals on the production that very night.
“Going through every production stage we had a plan A, B, and C,” JZ said. “While often Frame.io appeared in the plan A path, it was amazing to see how often it became the B- or C-plan if other systems failed or couldn’t deliver in time. The efficiency and speed of Frame.io were kind of scary at times!”
As the project progressed, nearly 50 people joined through Frame.io and used their connection to the cloud as a new way of collaborating with everyone involved. “The goal was to get people thinking about collaboration in an entirely new way,” Michael says. “Some of it for the first time.”
Technology company Colorfront built an integration with Frame.io so that dailies could be linked from the original camera files. “This list of assets is traditionally achieved through numerous downloads, transfers, and shipping, going through a number of different hands in the image chain of custody,” Michael says. The assets included:
- Instant dailies (H.264 proxy)
- Edit files (H.264)
- Rough cut outputs
- Audio files (WAV)
- Color-timed dailies (H.264 & DNxHR)
- Behind-the-scenes elements (JPEG and MXF)
- Original camera files (OCF)
But at the center of this HPA test, Frame.io became the sole tool used to deliver, and in some cases to create, all of the assets required for the post workflow. What we demonstrated to the HPA for the first time is that all of these things came from a centralized cloud system, and that Frame.io has created a viable cloud-based workflow.
Finally, for a truly cloud-based workflow to succeed, it needs hooks for end-to-end integration. After the rough cut was completed, an XML was exported from FCP X, downloaded to a local drive that also contained the original camera files, and loaded into Resolve.
This hybrid workflow proved that it is possible to host the original camera files and power the offline edit from the cloud, and then relink back to them locally for the conform. The timecode and metadata captured in the live streaming assets on the set were sufficient to link back to the original camera files for the final digital intermediate.
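To make the relink step concrete, here’s a minimal sketch of how a conform tool can pair cloud proxies back to their original camera files by matching reel name and start timecode. This is an illustrative simplification, not Frame.io’s or Resolve’s actual implementation; the filenames and timecodes are hypothetical, and a non-drop-frame timecode model is assumed.

```python
def tc_to_frames(tc: str, fps: int = 24) -> int:
    """Convert an HH:MM:SS:FF timecode string to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def relink(proxies, ocf_clips, fps: int = 24):
    """Map each proxy path to the OCF path whose reel and start timecode match.

    Unmatched proxies map to None, flagging clips that need manual conform.
    """
    index = {(c["reel"], tc_to_frames(c["start_tc"], fps)): c["path"]
             for c in ocf_clips}
    return {p["path"]: index.get((p["reel"], tc_to_frames(p["start_tc"], fps)))
            for p in proxies}

# Hypothetical example: one proxy streamed from the set, one OCF on the drive.
proxies = [{"path": "A003_C012_proxy.mp4", "reel": "A003", "start_tc": "14:22:31:05"}]
ocf = [{"path": "A003_C012.R3D", "reel": "A003", "start_tc": "14:22:31:05"}]
print(relink(proxies, ocf))  # {'A003_C012_proxy.mp4': 'A003_C012.R3D'}
```

The key point the sketch illustrates is that as long as the streaming proxies carry the same timecode and reel metadata as the camera originals, the offline cut can be conformed back to full-resolution files without ever re-ingesting media.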
The proof of the concept is in the viewing
The Supersession day began with attendees watching presentations staged directly next to where live production of the remaining scenes was taking place.
Frame.io was onstage using our camera-to-cloud proxy engine to capture all the takes in the moment and stream them instantly to the cloud, enabling the attendees to watch the live camera feed on two large screens.
During the Supersession lunch break, data was wrangled so that after lunch the VFX team could work on compositing, the editor could update the cut, audio could be mixed, and the final color grading could be completed in preparation for the final screening that evening.
Meanwhile, as final post-production continued through the afternoon, Frame.io presented its portion of the demonstration onstage, including a behind-the-scenes video (captured during the morning’s production) of the camera-to-cloud workflow for the attendees to view.
Reading about the process is one thing, but watching the video really drives the point home. You’ll see how we captured a moment from the cameras and streamed it to all the post-production tools (the “bones”) that are used in the normal post workflow in real time.
For the first time, Frame.io was the connective tissue that delivered all the assets to all the creative entities without having to download them to a drive. We replicated the same post-production workflow motion picture creators are used to—but without anyone having to wait for a drive to be shipped.
“For this year’s 25th Tech Retreat, the HPA program committee set out to bring truly next-level programming to the industry. We were extremely proud to enable the most cloud-based technologies ever assembled into one workflow and to present them live in front of hundreds of people,” Seth Hallen says.
“The only way to achieve this was to bring the community together in a truly collaborative way to innovate and learn. A few dozen companies made this happen, but the project would simply not have had the resounding success it did without Frame.io, whose tech became the key enabler to the whole experiment.”
“Under the leadership and passion of Michael Cioni, it was so exciting to watch true innovation and novel next-generation workflows take shape in real-time. We can’t wait to work with Michael and Frame.io on next year’s project!”
The wrap
The combination of technology and creativity is what Michael likes to call being “technative” (technological + creative). It’s about taking risks to power creative potential by leaning into technology.
At the end of that day’s Supersession, we accomplished what we set out to do with our proof of concept. We proved that Frame.io isn’t just about creating a product. It’s about bringing together a community of creators and helping all the contributors work more seamlessly and fluidly.
When a human body is fit and healthy, you don’t notice what’s going on inside it. You just know that it responds to whatever you ask of it, whether that means making quick bites of content to be viewed on a mobile phone or jumping out of planes to capture extreme aerial footage for IMAX blockbusters.
And it’s just the beginning.