Made in Frame: Cutting the Thriller “Searching” in Adobe Premiere Pro

Highlights:

  • Searching breaks the mold and sets a new standard for the “on-screen” film.
  • The editors created a feature-length “animatic” of the entire film (with the director filling in all the roles) before shooting began.
  • Premiere Pro’s integration with Illustrator and After Effects played a huge role in the creation of the media used in the film.
  • Every element on the screen was created by hand in Illustrator rather than captured as a screenshot.
  • There were three stages of post-production, the first starting months before shooting began.
  • Director Aneesh Chaganty loved using Frame.io, and he and the producers used it…a LOT.

In Every Frame A Painting’s 2014 video essay A Brief Look at Texting and the Internet in Film, Tony Zhou notes that, “many have tried, [but] we still don’t have that one really good way of depicting the internet.” Although he goes on to praise the then-new concept of the “desktop-film,” he quickly bemoans the lack of innovation when it comes to American films of the genre.

Four years later, Searching appears to be that answer.

Set entirely on computer screens, Searching stars John Cho (of Star Trek and Harold & Kumar fame) as a father desperately trying to find his missing daughter by searching her laptop for clues.

On-screen films may seem like familiar waters (Profile and the Unfriended films quickly come to mind), but for Searching, director Aneesh Chaganty needed a post team that could help him bring a more cinematic approach, one that hadn’t yet been fully explored within the genre. For that, he enlisted the help of editors Will Merrick and Nick Johnson.

Where most of these films hold on a wide shot of the screen, almost like a filmed play, Searching uses a variety of traditional camera moves and compositions: close-ups, montages, and pans.

In order to bring this vision to life, the Searching editorial team ended up building their own workflow, using motion graphics techniques from After Effects to craft computer-screen “performances” that accompanied the powerful live-action performances of John Cho and Debra Messing. The result is a film where even the movements of the screens and cursors are evocative.

When’s the last time the movement of a mouse made you cry, or kept you on the edge of your seat?

Getting involved

While the two editors had never met before (let alone worked together), they shared an alma mater and had both previously collaborated with Aneesh on separate projects.

“We both went to USC [University of Southern California], but we actually never met there,” Will told me at their offices in Los Angeles.

“We both knew Aneesh as well. I edited this Google spot for Aneesh [Seeds, a short film shot entirely on Google Glass] that got him a job there. He worked there for two years, and when he came back, he hit me up to edit this weird movie he was making that all takes place on a computer screen.”

Seeds, directed by Aneesh and cut by Will, was shot entirely on Google Glass.

Nick had his own history with the director in film school, sharing several screenwriting classes and DP’ing a short Aneesh produced in 2012. After he graduated, Nick edited features, shorts, and mini-series. When the Searching team realized they’d need another editor to share the workload, Nick was Aneesh’s first phone call.

Choosing Premiere Pro

Timur Bekmambetov, who produced Searching and the Unfriended films and directed Profile, has become a huge champion of the on-screen aesthetic; he even developed the “Screenlife” software to aid films like this.

Most of the “desktop-films” I mentioned earlier were either cut in Avid (the dominant industry NLE in Hollywood) or Final Cut Pro X. But despite the suggestion by Timur’s production company (Bazelevs) to use one of these programs, Will and Nick quickly agreed they would edit on Premiere Pro. Will and Nick felt right at home in Premiere and relied on its integration with other Adobe products as part of the post-production process.

And as you might expect, a film like Searching had a very unconventional production process, especially when it came to post, which they referred to as three stages: Pre-Post, Post, and Post-Post.


Stage one: pre-post

Working together

“We met with Aneesh a month or so off-and-on before the crew started shooting,” Will recalled. “That was just what he called ‘rehearsals with editors.’ So we were just talking through the movie. We threw frames in from the script. ‘This is what this is going to look like.’ ‘In this scene, the mouse is going to go over here.’ ‘What does this window look like?’ Then we got in the editing room, and we edited for seven weeks and basically put together an animatic. It was really an unusual thing, because, as editors, you very rarely come to a timeline with a totally blank slate with no assets at all. So we were just there creating our own assets basically just doing rough screen recordings.” They primarily did this by taking screenshots and screen recordings of each other’s text messages and FaceTime calls.

To divide up the labor, they broke the script down into 26 sections identified by an A-to-Z “editing scene code,” generally split at natural stopping points, since the film didn’t have traditional slug lines or scenes.

“For a lot of action, you break down by physical location, because that’s the easiest metric,” Nick said. “And we didn’t have that in the script; the script was originally a ‘script-ment’ [a mix between a script and a treatment], so we were just doing it by chunks and scenes.” Production later turned that into a 200-page traditional script format for the assistant director to use on set.

As they each cut their 13 assigned sections, they would trade them back so that each editor got a pass on every scene. According to Nick, “It was great to have somebody else there to bounce ideas off of, especially in terms of workflow.”

Will agreed, “Imagine going wrong, and you’re the only editor and you can’t even explain where you went wrong to the director and producers.”

Adobe Premiere Pro out-of-the-box

You might assume the duo used several plugins to make this film happen, but they really didn’t. “We weren’t using particle engines or anything insane like that,” Will noted. In fact, Nick and Will relied almost entirely on Premiere Pro’s basic out-of-the-box capabilities: as they put it, they used a lot of cropping, a lot of frame holds, and a LOT of nests—with sometimes 45+ video tracks of assets.

This became a challenge at times, especially when it came to adjusting timing. Will said, “We’d do a lot of little timing adjustments, so we’d have to go in, adjust the timing of say 20 or so tracks inside the nest, remember how many frames you moved it by, move the camera keyframes, and then we’d have to move all the sound assets as well.”

Despite these challenges, it was ultimately Premiere Pro’s tight integration with Illustrator and After Effects (both of which played a major role in the creation of the film) that made it all worthwhile. More on that later.

Directors of “virtual photography”

The nests could be frustrating at times, but they became necessary as the editors worked closely with the director in these early stages to add the more cinematic elements of the film.

Nick said, “We would create camera moves just by using motion keyframes on our nests that had the entire computer screen in it, and we could focus wherever we needed to at any time. It sort of grew iteratively through the whole editing process. We would come up with cooler camera moves, and in general we found ourselves going in for fewer wides and more close-ups as we kept editing. Then at the end we even added some handheld motion to some of the shots.”

In film school, you’re generally taught the basics of camera and editing rules, but with Searching, it quickly became apparent that some rules were made to be broken. Will said, “Early on I remember talking through it just in theory, like, ‘Well if we’re over here in a close-up on one side of the screen we can’t cut over here to another side of the screen,’ which ended up being totally false. We were still hashing out what the cinematic rules were, and continued to develop them all the way up until we were locking. We just got more comfortable.” Nick added, “I think we could do even more now.”

The amount of visual control the editors now had over the on-screen assets led co-writer/producer Sev Ohanian to offer them a new, additional credit: “Directors of Virtual Photography” (and don’t worry, DP Juan Sebastian Baron approved).

Finishing the pre-production animatic

For the live-action elements of the animatic, Will said they “assembled it by using screenshots of Aneesh’s face and voice as all of the characters. The only problem being Aneesh talks at 1.5x the speed of a normal human being, so even though our animatic was an hour forty minutes (which is the exact runtime of the finished film), there was a lot of fat in there because he was talking much faster and emotional beats weren’t there yet.”

Nick concluded, “So we trimmed a lot, but it ended up slower-paced. But yeah it ended up at the same runtime.”

Stage two: post

Cutting during production

With the full animatic and script in hand, the crew set off to shoot Searching in 14 days—largely on GoPros—and while they were in production, Nick and Will were already cutting footage from each night’s shoot into their sequences.

Will said, “They took our animatic, showed it to the crew the night before production, and then they shot the entire movie using that animatic as a reference. So John [Cho] and Debra [Messing] were able to see what would be on their screens, and where their eyes were going. Then we just slowly cut throughout production; as we were getting dailies every day, we’d watch them and cut them in. That was the traditional part of the process.”

Organization

Nick had already cut a handful of features before, and according to him the organization process on Searching wasn’t too different. “We had edits, we had our sequences—both by reel and by section. The fundamental organizational process that we did right from the beginning was to break the script down into sections. That allowed us to work on things simultaneously, but also organize the assets that are specific to, say, ‘montage A,’ into a bin that’s for ‘montage A,’ which is then broken down into apps.”

Will also noted, “Our graphics folder was our biggest and probably most non-conventional part of the movie; we even put it above our footage folder. We also had a general folder for things like mouse cursors, pointers, desktops, things that recur constantly.”

Collaborating in Frame.io

Because so much work had been done before the production crew ever stepped on set, Nick and Will had full rough cuts fairly early in the process. When it came time to decide how best to receive feedback from the director and producers, Frame.io quickly came to mind.

Nick recounted, “I think the very first meeting we had here at this table, we brought up, ‘well, can we use Frame.io?’ I had used it at BuzzFeed (they use it a ton), and it was instrumental and a great way to take notes. I hate getting notes that are not attached to time.”

Will agreed. “There’s nothing worse than getting a few different emails from people and one of them is just random notes and another one is notes by a different timecode than you’re using.”

According to Nick, the film’s director quickly fell in love with it. “I don’t think Aneesh had used it before, so the first day of doing notes he was like, ‘ah, this is great!’ We’d come in in the morning and have a full list of notes. And it was great because we could just go note-by-note and mark them as done. Frame.io really lends itself well to people who give specific notes the way that Sev and Aneesh do.”


Stage three: post-post

After they locked picture, they went into post-post—which was primarily the creation and animation of all the effects and final on-screen assets.

After Effects and mouse movement

Premiere Pro has its limits when it comes to motion, so the post team made heavy use of After Effects. As Nick put it, “In After Effects we used Motion Sketch for all the mouse movements [in the rough cuts]. It was great because it just captures those keyframes, and then we’d go into the graph editor and just kind of stretch them a little bit if we needed the mouse to arrive somewhere. It gave it a kind of nice, natural motion.”

But because the mouse was such a crucial extension of a character’s feeling, Will noted that it often took several versions to get the motion just right. “We really did a lot of iterations and just kept showing it to people, asking what they thought. If you watch the very first animatic, a lot of the nuance wouldn’t be there.”

They iterated on everything that went into a shot, using different versions of the animated mouse movements to elicit different emotional cues. “We might move the mouse up, pause it a little bit, and then move it up some more, and we’d find that that adds a little bit of hesitation for the character. It was just discovering things like that to bring in more nuance and trying to inject a little bit of emotion into it.”
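
None of this was scripted on the actual film (the editors shaped these curves by hand with Motion Sketch and the graph editor), but a small Python sketch helps illustrate why an eased move with a pause in the middle reads as hesitation rather than confidence. The frame rate, positions, and timings below are invented for illustration.

```python
# Illustrative only: approximates the "move, pause, move" cursor beat the
# editors describe, not their actual After Effects setup.

def ease_in_out(t):
    """Smoothstep easing: slow departure, slow arrival (0 <= t <= 1)."""
    return t * t * (3 - 2 * t)

def segment(start_y, end_y, duration, fps=30):
    """Per-frame cursor y-positions for one eased move."""
    frames = int(duration * fps)
    return [start_y + (end_y - start_y) * ease_in_out(i / frames)
            for i in range(frames + 1)]

def hold(y, duration, fps=30):
    """Hold the cursor still -- the 'hesitation' beat."""
    return [y] * int(duration * fps)

# Hypothetical values: move partway toward the target, hesitate, then commit.
path = segment(900, 600, 0.4) + hold(600, 0.3) + segment(600, 420, 0.5)

for frame, y in enumerate(path):
    print(f"frame {frame:03d}: y = {y:.1f}")
```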

Illustrator

While they made great use of screen recordings throughout editing, Nick mentioned that, due to their limited resolution, those recordings could never be used in the final product. “If we had left screenshots in, [they] would have been really pixelated as we got close. Even with Photoshop, it would’ve fallen apart at a certain point too.”

Their solution, as Will put it, was to make everything in Illustrator. “Finder windows, Facebook, Chrome…we did all of that in Illustrator, and a few things in Photoshop. We overcut our animatic with those new files, and then animated them in After Effects.”

Moving to Illustrator allowed the team to use vector graphics instead of raster graphics. Vector graphics have the benefit of being infinitely scalable, so Nick and Will could “zoom” in as close as they wanted and everything remained sharp.

Some of their uses of Adobe products were more unconventional than others. When it came to the subtle glitch transitions on the FaceTime footage (a common trick in the genre for hiding multiple takes within a single shot), they initially tried datamoshing, but the results looked too blocky, better suited to a horror film.

To get the right effect, Will said that they would run the footage through Media Encoder seven times. “Sometimes we’d do three steps: we’d have a pass at one megabyte per second, then half a megabyte, then 0.1 MB.”
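
For anyone who wants to experiment with a similar look, here is a rough sketch of the same multi-pass idea using ffmpeg as a stand-in for Media Encoder (an assumption; the Searching team worked inside Adobe’s tools). Each pass re-encodes the previous output at a lower bitrate, so compression artifacts accumulate gradually rather than all at once. Filenames and exact bitrates are hypothetical.

```python
# Illustrative sketch of progressive re-encoding; ffmpeg stands in for
# Media Encoder here. Note that Will describes megabytes per second, while
# ffmpeg's -b:v flag takes bits per second, so these values are approximate.

import subprocess

def degrade(src, dst, bitrate):
    """Re-encode a clip at the given video bitrate (e.g. '1M', '500k')."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-b:v", bitrate, "-c:a", "copy", dst],
        check=True,
    )

# Hypothetical source clip and three progressively lower-bitrate passes.
passes = ["1M", "500k", "100k"]
current = "facetime_take.mov"
for i, rate in enumerate(passes, start=1):
    out = f"facetime_take_pass{i}.mov"
    degrade(current, out, rate)
    current = out
```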

The future of on-screen storytelling

The desktop-film has opened up an entirely new avenue of storytelling, especially for low-budget filmmakers, since the format can be pulled off with minimal resources.

Nick was quick to point out that while it’s been a successful tool for them, it’s still just a tool to help tell stories. “Keep the focus on story, but also push the boundaries of the screen as far as you can. Because I think we found the more we pushed it, the better it got. And I don’t think we came anywhere close to the limits of that, actually.”

He made one more note to future on-screen filmmakers—keep it realistic. “As our generation is getting older and we’re making the movies, we’re less willing to accept shitty knock-off versions of these websites and social media. So if you’re going to do it, you’ve gotta earn it. You have to do it the way it actually is in the real world. It has to have some realism, otherwise, people aren’t gonna buy it.”

One other great thing about a film like this is the number of Easter eggs and inside jokes the two were able to sneak in (there’s an entire sub-story about an alien invasion that appears throughout the movie!). They snuck in so many that producer Sev often joked he would murder them, which led to my personal favorite Easter egg: a news ticker that reads “Hollywood Producer is suspected of murdering feature-film editor.”

But I don’t think Will and Nick need to worry about art imitating life. As of this writing, the film has a 94% Rotten Tomatoes rating and an opening weekend gross more than 3x what Sony paid for the film at Sundance. Given that success, I’m sure the producers of this (and many other films) will be on their respective social devices reaching out to Will and Nick soon.

Except for BTS photos, photography by Irina Logra.

Andy Young

Andy Young is a comedy director and editor in Los Angeles. He’s recently worked with David Zucker, CollegeHumor, SoulPancake, Cracked, and Fusion, among many others. In addition to Frame.io, he also writes for Moviemaker Magazine. He loves cutting on Premiere; tell him why he should/shouldn’t @AndyYoungFilm.