Made in Frame: Let’s Go! will.i.am and J Balvin Get Unreal with Huffman Creative

From the moment MTV went on the air in the summer of 1981, music videos have been known for their boundary-pushing concepts and cutting-edge production techniques. For many artists, the form both reflects and informs their signature style.

will.i.am is one of them. The multifaceted superstar and entrepreneur is a passionate technology explorer whose interests have led him to design smartwatches and iPhone apps, and even to make interplanetary music history as the first person to stream a song from the surface of Mars.

No surprise, then, that “Let’s Go,” his most recent music video collaboration with J Balvin, finds them traversing fantastical environments in an ultra-slick race car. The collaboration among the teams at Huffman Creative, JAMM VFX, and The Storyteller’s Desk, harnessing technologies including Unreal Engine, Houdini, Maya, Adobe Creative Cloud, and Frame.io, resulted in a mind-bending XR experience that once again pushes the artistry of music videos to new heights.

A high-concept video

VMA-winning director Andrew Donoho, known for his work with a who’s-who of artists that range from Beck and Paul McCartney to Run the Jewels and Skrillex, had previously worked with J Balvin. His video for “Toretto” features Balvin showing off his drifting skills for a duly impressed Vin Diesel.

So when the concept for this video came up, Andrew was a natural choice, and Balvin is once again behind the wheel as he races will.i.am through five impossible environments.

The concept was to create an XR (extended reality) experience that hewed to a realistic aesthetic rather than looking like part of a video game. The viewer should feel as though the car is in an organic environment, with natural lighting and reflections that sell the illusion.

“Will and I connected directly and spent a couple of months building the creative from the ground up and bouncing ideas back and forth,” Andrew says. “In his music videos he exists in a kind of near-future environment and he really likes science fiction and new tech, so we wanted to do something that had a lot of different backdrops and locations. It needed to be shot all in one day, and we were dealing with a car that had a lot of chrome, which meant there would need to be a lot of reflections.”

“The idea was that we should use an XR stage with a volume and Unreal Engine so we could build out an elevated version of his sci-fi world and could also incorporate a lot of variety in the lighting that would accommodate this reflective world and elements that really wouldn’t be able to exist on green screen,” he adds.

If you’ve ever worked on music videos, you know that they’re incredibly challenging to pull off. Especially at this end of the spectrum, they’re something akin to a Super Bowl commercial, demanding ultra-high production value, but on an often compressed schedule or reduced budget. So when you’re talking about embarking on a project like this, you need a lot of ingenuity and some very capable hands.

When looking for the perfect design team, Huffman Creative’s Head of Production, Katie Sarrels, reached out to JAMM’s Executive Producer, Julie Weitzel. From previous experience on other high-level projects, Katie knew that JAMM would be the best team to help execute this concept flawlessly. With Huffman handling creative concepting and production, and JAMM handling the VFX, they brought together the right mix of talent and brains to pull it off.

Planning is key

The team at Huffman Creative are no strangers to complex productions. As a full-service creative studio that handles everything from initial concept and bidding to location and technology research, writing, talent contracts, and all phases of production (practical and virtual), as well as the full spectrum of post-production needs, their work spans commercials, music videos, photography, live events, and more. And with talent relationships with artists like Bad Bunny and Ariana Grande, sports figures like Mike Trout and Damian Lillard, and numerous other influencers including TikTok star Bella Poarch, they’ve produced countless projects with viral results—and amassed prestigious awards along the way.

As a frequent collaborator with Andrew, Executive Producer and Company Founder Ryan Huffman knew that this particular video would come with its own unique set of challenges, but he was confident that once again they would assemble the right team and solutions to tackle it.

Although Ryan tends to function as EP on a day-to-day basis, he also lent his expertise as a producer and conceptual developer for this project. “Because of how complex XR is, you have to go through a lot of the stages of pre-production in order to lock the budget,” he says. “On a traditional shoot you might be able to say, ‘We’ll just get a mansion and figure it out from there,’ but for a project like this you have to go deep on storyboarding and other types of concepting before you really know how much it’s going to take to execute.”

On a traditional shoot you might be able to say, ‘We’ll just get a mansion and figure it out from there.’

It meant that a lot of key players had to be involved early in the process, including production designer John Richoux, set designer John Doni, and art director Nick DeCell. “From a creative standpoint you have to go very deep to make sure that we’re presenting something that’s feasible and looks good,” Ryan says.

John Doni utilized Photoshop and SketchUp in the previsualization stage, pulling imagery and textures from Adobe Stock to create the visual palette that would be used not only for the virtual backgrounds but also for the actual physical elements built on the set. With the creative team scattered across Los Angeles and will.i.am traveling, the many collaborators relied on Frame.io to stay in sync creatively.

“I actually didn’t meet with Will in person until the day before the shoot,” Andrew says. “We would cut the storyboards in sequence to the music so we could share them with him and he could drop notes directly on the frame. Same with the mockups of the environments and renders from JAMM.”

will.i.am had done a previous video in an LED volume but wanted to push the process even further with this one. Andrew estimates that they spent about two and a half months prior to the shoot to get everything prepared. “We built visual decks and treatments and used AI for some of our concepting, and then took that over to JAMM to start building them into a physical reality in 3D.”

Starting the (Unreal) engine

Working with new technology requires a steady hand on the wheel, and VR producer Tom Thudiyanplackal, an experienced Unreal Engine filmmaker, was brought in to steer the XR component with his company, The Storyteller’s Desk. As a member of the USC Entertainment Technology Center (ETC), his work on the Cannes award-winning student film Fathead was documented in the ETC’s subsequent white paper, in which Tom details the elaborate process of pushing the technology to new places as a test case for future mainstream productions.

Like this one. After a lengthy process of vetting each VP stage in Los Angeles, the team chose XR Studios, which has two stages—one with an LED floor and another with a practical floor—allowing them to capture everything in a single day. It also enabled them to use practical set pieces and materials that would help sell the realistic look Andrew was after.

Andrew explains: “Because you’re trying to capture everything in camera—the lighting and shadows, the reflections, the textures—what XR offers is that the art department was actually able to build out what the talent were standing on, what they were touching, what was around them. The screen then provides the backdrop and they can merge that floor plane and that ground plane with the background. We color matched the seams so that the practical sand flows into the sand of the background world.”
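
That seam-matching step is conceptually similar to a classic mean-and-variance color transfer between two image regions. Here is a minimal, purely illustrative Python/NumPy sketch; the production’s actual tools and method aren’t specified, and the idea of sampling patches on either side of the seam is an assumption for illustration:

```python
import numpy as np

def match_color(source: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Shift the source region's per-channel mean/std to match the
    reference region (a Reinhard-style transfer). Both arrays are
    float RGB in [0, 1], shaped (H, W, 3)."""
    src_mean, src_std = source.mean(axis=(0, 1)), source.std(axis=(0, 1))
    ref_mean, ref_std = reference.mean(axis=(0, 1)), reference.std(axis=(0, 1))
    matched = (source - src_mean) / (src_std + 1e-6) * ref_std + ref_mean
    return np.clip(matched, 0.0, 1.0)

# Hypothetical usage: sample a patch of practical sand and a patch of
# the LED background near the seam, then blend the corrected patch in.
practical = np.random.rand(64, 64, 3)   # stand-in for a sampled patch
virtual = np.random.rand(64, 64, 3)     # stand-in for the wall's sand
corrected = match_color(practical, virtual)
```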

But, again, this requires lengthy and meticulous preparation. Tom worked closely with VFX supervisor Troy Moore and his team at JAMM as they created the models and environments in Houdini that would play in the volume, figuring out the details of what it would take to properly project the images on the LED walls.

“Houdini is a great tool for procedural generation and it’s wonderful for a director or creative person to work with an artist and be able to see their world come to life, but it’s not such a great tool for real time,” Tom says. “The main challenge is that the file sizes are pretty large and the file types may not be compliant with the pipeline you need for real time. So you first have to bring that content into Unreal Engine and basically shave it down so it still holds all of the beauty that was visible within Houdini.”

“The artists would export the meshes and textures from Houdini and reassemble pretty much everything within Unreal Engine to optimize it so that we get a lag-free performance on the wall. The general math is that if you’re trying to hit about 30 fps on the wall, you try to hit about 90 fps on your computer, so that when the sequences pass through the end display pipeline—even when you have some loss of processing and frame speed—it still delivers 30 fps on the wall.”
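
Tom’s rule of thumb boils down to a simple headroom calculation. A toy sketch of that arithmetic, using his stated 3x factor (a heuristic from his quote, not a fixed Unreal Engine requirement):

```python
def engine_fps_target(wall_fps: float, headroom: float = 3.0) -> float:
    """Frame rate to hit in-engine so the LED display pipeline still
    delivers wall_fps after processing and frame-speed losses."""
    return wall_fps * headroom

def frame_budget_ms(fps: float) -> float:
    """Per-frame render budget in milliseconds at a given frame rate."""
    return 1000.0 / fps

target = engine_fps_target(30.0)  # 30 fps on the wall -> 90 fps in-engine
print(f"engine target: {target:.0f} fps "
      f"({frame_budget_ms(target):.1f} ms per frame)")
```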

Working in concert

This also required the JAMM and Unreal teams to work ahead of time with Andrew and cinematographer Idan Menin so they could accurately create backgrounds that would match what the camera would capture. “We like to be more disciplined within virtual production—to have an idea of how much the camera’s going to move, what kind of lenses are going to be switched to during a shot, so that when you build the environment you know where the virtual edges will fall off,” Tom adds.
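
Knowing where “the virtual edges will fall off” is largely a frustum question: from a lens’s focal length and the sensor width you can estimate the horizontal angle of view and how much wall a shot covers. A hedged sketch with placeholder values (not the production’s actual specs, and anamorphic glass complicates this with its squeeze factor):

```python
import math

def horizontal_fov_deg(focal_mm: float, sensor_width_mm: float) -> float:
    """Horizontal angle of view for a given focal length and sensor width."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

def wall_coverage_m(fov_deg: float, distance_m: float) -> float:
    """Width of LED wall the frustum spans at a camera-to-wall distance."""
    return 2 * distance_m * math.tan(math.radians(fov_deg / 2))

fov = horizontal_fov_deg(focal_mm=50.0, sensor_width_mm=36.0)  # placeholders
print(f"{fov:.1f} deg FOV covers {wall_coverage_m(fov, 6.0):.1f} m of wall at 6 m")
```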

Andrew and Idan, who had collaborated on two previous projects, chose to capture on the new Sony Venice 2 (recording at 8.6K X-OCN ST), which Andrew describes as “a real treat. Its dual ISO also allowed us to brighten everything in camera without losing anything, so that way we really could use the screens to illuminate the subject. One of the big advantages of the LED volume is that when you catch the sunset behind the talent, it will actually illuminate them the same way that a real sunset would.”

The team also used a Sony FX3, recording at 4K ProRes RAW. And then there were the lenses. “We used the same 70mm anamorphic lenses that were used in The Hateful Eight and The Mandalorian,” Andrew says, with genuine excitement. “It’s a really fun approach because you take this very old, gorgeous glass that has a huge scope and interesting personality, and then you put it onto this world and it really helps ground it in reality.”

“You have all this new technology, these new cameras, everything is super crisp and sharp and beautiful. But then when you put these gauzy vintage lenses on there, it chips away at the digital edge, it makes it feel even more organic, which again is one of the things you can’t do on green screen, because you want everything to have crisp edges for keying. But with XR you’re able to get those lens flares, you’re able to get the softness, you’re able to let the lenses bend a little bit because what you’re capturing in camera is your actual shot with the visual effects included.”

You’re able to let the lenses bend a little bit because what you’re capturing in camera is your actual shot with the visual effects included.

Tom confirms that the choice of vintage lenses also helps to mask any of the less performant aspects of the volume technology. “Anytime something’s not 100 percent there, filmmakers have a wonderful way of masking or working with it. With any form of lens, cinematographers will never call the imperfections of the optics of the lens imperfections,” he states. “They call them characteristics.”

“In this case they only added to the whole process because the walls are not necessarily at a place where they’re 100 percent ideal for film work. We don’t want to look at those things with too keen an eye because it won’t be perfect. So in the case of using these anamorphic lenses, they added great character to the image and made it much more seamless in terms of the integration of the physical world with the digital world.”

After concepting, storyboarding, creating the 3D elements, and adapting them for playback on the volume, there’s one more critical step: taking the time to prep on the stages prior to rolling the camera. While the environments themselves didn’t need to be carefully timed for playback, there are moments in the video in which, for example, the lighting goes from daytime to sunset or the car goes into and out of a tunnel.

“Elements like that needed to be timed to the music as best we could,” Andrew says. “JAMM was able to focus on the visuals and the Unreal team made sure that we had the flexibility we needed when we started dialing in the specific lighting cues and movement speeds. Thankfully, they had built us a bunch of animation elements that showed how this would look in motion so that we had a strong reference going into timing them.”
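
Under the hood, a cue like the day-to-sunset shift comes down to interpolating environment parameters along the song’s timeline. A toy sketch of that idea, with invented cue times and values rather than the production’s actual cue list:

```python
def lerp(a: float, b: float, t: float) -> float:
    return a + (b - a) * t

# (time_sec, sun_intensity) pairs -- invented values for illustration
CUES = [(0.0, 1.0), (42.0, 1.0), (48.0, 0.25), (60.0, 0.25)]

def sun_intensity_at(t: float) -> float:
    """Piecewise-linear interpolation between lighting cues."""
    for (t0, v0), (t1, v1) in zip(CUES, CUES[1:]):
        if t0 <= t <= t1:
            return lerp(v0, v1, (t - t0) / (t1 - t0))
    return CUES[-1][1]  # hold the last cue past the end

print(sun_intensity_at(45.0))  # midway through the sunset ramp -> 0.625
```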

Idan can’t overemphasize the importance of preparation. “When shooting a virtual production it is so crucial that there is interdepartmental communication. The VAD (Virtual Art Department), art department, lighting, camera, and stage teams must all work in concert to pull off photorealism in camera. The other challenge was doing all that within the constraints of a non-volumetric virtual production stage,” he explains. “There is a common misconception that all virtual production stages are ‘volumes.’ This is only true, however, if the stage itself consists of a 360-degree wall and ceiling. Although XR Studios had the two stages, which allowed us to capture all our setups in one day, they did not have 360 degrees of LED wall.”

Which meant that his team also had to manipulate the lighting to meld the physical world with the virtual one. “We were tasked with continuing the walls of the stage with traditional lighting tools, which proved challenging both technically and logistically. We brought in and built numerous custom frames of muslin with Creamsource Vortex8 lights that continued the lighting effects of our wall, while being short enough in height to not block the collection of witness cameras from tracking the Sony Venice. Having this array of Vortex lights allowed our programmer to simulate, and when necessary animate, lighting cues in sequence with the LED wall,” he says.

Tom specifically cites the space dance sequence as one example of how they pushed the technology to achieve amazing illusions. “The stage that has the LED floor integrated with the wall is a very sophisticated technology where, from the point of view of the camera, it’s as if the floor and the wall disappear and you start to peer into the dimensionality of the 3D world. will.i.am and the dancers were on a physically constructed platform on top of the LED floor, and from the perspective of the camera we were able to create the illusion that they were floating through space among the buildings and a huge city—all in camera.”

Cutting the right corners

If you’re keeping count, there were four separate teams that needed to collaborate closely during the pre-production and production stages (Huffman/Andrew, JAMM for VFX, The Storyteller’s Desk for Unreal Engine, and XR Studios)—as well as the talent and their management.

And then there was post-production, with Andrew himself editing, additional VFX work by Denhov Visuals, and color grading by Matt Osborne and the team at Company 3.

During every phase, Frame.io helped keep the teams on course. But in post-production, it played an even bigger role.

“We put a lot of the budget into the previs, the setup, and XR,” Andrew says. “The goal was to walk away from production with something that was almost there, so at the end I put together a smaller, scrappier team for post. I have a background in visual effects and the Adobe tools, and I know how to mix and match the software to make it efficient.”

It’s rare for a production to emerge without having encountered detours or roadblocks. In this case, the race car they were using was an earlier model. And because this video will also serve as a promotional piece for the racing entity itself, the team was required to update the car in post.

Denhov Visuals worked primarily in After Effects and Nuke to make those adjustments, and Andrew relied heavily on Frame.io to leave frame-accurate annotations and notes on the work with that team—as well as for sharing cuts and assets as he edited in Premiere Pro and DaVinci Resolve.
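
For teams who want to script this kind of frame-accurate feedback, Frame.io also exposes a REST API for posting comments on an asset. A minimal sketch against the v2 API, assuming a developer token and asset ID (both placeholders below); drawn annotations like Andrew’s use a richer payload than shown here:

```python
import requests

API = "https://api.frame.io/v2"
TOKEN = "fio-dev-token"   # placeholder developer token
ASSET_ID = "asset-uuid"   # placeholder asset ID

def post_comment(text: str, frame: int) -> dict:
    """Leave a comment pinned to a specific frame of the asset."""
    resp = requests.post(
        f"{API}/assets/{ASSET_ID}/comments",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"text": text, "timestamp": frame},  # v2 timestamps are frame-based
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

post_comment("Swap the front wing to the updated car model here", frame=1421)
```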

There’s nothing else out there where I can upload all the raw footage and then add comments to it and still make it seem like something that’s presentable.

“I love that I can draw on the frame to very specifically point out what needs to be removed, or what needs to be brightened or darkened within our actual edit. We were adding the car parts and the artists were able to grab screenshots and downloads from Frame.io at different aspect ratios and file sizes so that they could do previs on stills.”

Andrew adds, “Frame.io is also amazing for huge file dumps. There’s nothing else out there where I can upload all the raw footage and then add comments to it and still make it seem like something that’s presentable. And then as we got VFX cuts in and elements in from our post teams we’re able to again seamlessly reference old edits or notes, which makes the QC process so much easier. Because unless you have a massive team or infrastructure to manage all the files and assets, you need to have it all in one place so that everyone can get what they need.”

Throughout the editing process, Andrew was also sharing cuts with will.i.am and his management team. “Sending them through on Frame.io gave them the chance to ask questions and to respond to my notes,” he says. “I also really love that instead of the traditional approach of placing text on screen to explain what’s going to change and what the effect will look like in the end, you can just drop that into the review section and they can see it right there.”

Present, past, and near future

If, as Andrew says, will.i.am exists in a near-future world, “Let’s Go” took the entire team to the near future of motion picture technology. Yet Andrew also appreciated the aesthetic of pairing vintage glass with brand-new digital cameras.

Along those same lines, there’s an aspect of working in virtual production that leans decidedly toward techniques that have been used in television for decades.

“The one thing to keep in mind about virtual production, especially when you work with LED walls, is to only trust what the camera sees,” Tom states. “We have come to a place where the wall is essentially an electronic signal and so is the camera sensor. So we’ve moved into the territory of broadcast and have to rely on the strengths of using scopes to know what the image is rather than our eye or a calibrated monitor. The habit is to fall back onto using meters but you can’t trust your eye. Trust what the camera sees.”
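
In practical terms, “trust what the camera sees” means trusting measured values over perception. A small illustration of the scope mindset, assuming a captured RGB frame normalized to [0, 1]: compute Rec.709 luma and flag near-clipped highlights rather than judging exposure by eye:

```python
import numpy as np

REC709 = np.array([0.2126, 0.7152, 0.0722])  # Rec.709 luma coefficients

def luma_report(frame: np.ndarray, clip: float = 0.95) -> str:
    """frame: float RGB array in [0, 1], shaped (H, W, 3)."""
    luma = frame @ REC709
    clipped_pct = float((luma >= clip).mean()) * 100
    return (f"luma min/mean/max: {luma.min():.3f}/{luma.mean():.3f}/"
            f"{luma.max():.3f}, {clipped_pct:.2f}% of pixels near clipping")

# e.g. run on a frame grabbed from the camera feed (random data here)
print(luma_report(np.random.rand(1080, 1920, 3)))
```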

From the cinematographer’s standpoint, there’s also a change in mindset that occurs with virtual production. “In a way, it’s reversing some lazy habits that the industry has adopted in the wake of the digital revolution. The ‘fix it in post’ mindset does not meld well with this process and to pull it off in its best form, the culture of filmmaking needs to return to committing to choices up front in prep,” Idan states.

The more we can prepare and make decisions early, the better set up we will be to be inspired, react, and pivot on the day of the shoot.

“VFX is still a huge part of this process but it is there to enhance and build upon choices made long in advance of production. This project pooled together a wonderful team of people committed to making those decisions ahead of time and seeing it come together was inspiring. Going forward, I’d like to double down on the ‘prep matters’ mindset as I find that the more we can prepare and make decisions early, the better set up we will be to be inspired, react, and pivot on the day of the shoot.”

And then there’s the aspect of the way teams are collaborating in our new reality. “What once would have required all of us to be at JAMM’s office for sessions, then going to the Unreal team and then Will having to fly in to see things—as of 2020 that doesn’t really exist anymore,” Andrew says. “When you are remote and in our new 2023 world, it’s now possible to do a project of this complexity without being in person. It was definitely a very wireless workflow.”

Taking the lessons of the past to push technology forward? You could almost say that it’s a little like creating a modern, super-performant race car.



Lisa McNamara

Lisa McNamara is Frame.io's senior content writer and a frequent contributor to The Frame.io Insider. She has worked in film and video post-production approximately since dinosaurs roamed the planet.