How Did That Get There? The Origin of the DIT (and DIT Cart)
The DIT cart. Even if you’ve never been on set, you probably know what they look like. It’s the mysterious trolley with the computer, the hard drives and media cards, and the clearest, cleanest monitors. Usually with a sign on it that says DO NOT TOUCH, or NO, REALLY, DO NOT TOUCH in enormous capital letters.
As a camera assistant, I’ve been working alongside these carts for years. I’ve studied their layouts, ordered and tested equipment for them, and even helped assemble them during camera preps.
But recently, I’ve begun to wonder where these carts came from. Who decided what should go on them, and how? Just when did we start wiring cameras to a four-wheeled hub with a person attached?
In short, how did the DIT even become a thing?
In the beginning…
“When Abby and I started, we were called video engineers,” Barry Minnerly kindly explains. “You had a video camera and that was it. There were no focus pullers, there was no second camera assistant. There was just the camera operator. They controlled zoom and focus, they did everything.”
Abby Levine agrees. “Someone would go out with a camera, they’d set the blacks and whites to auto, and they’d shoot. They didn’t take great care to correct the color and exposure and all that. We were doing ENG, electronic news gathering. Then we took it to the next level, to EFP (electronic field production).”
“The video engineer was the glue,” Minnerly adds. “They were the people who took all the gear and connected it.”
“The video engineer was the glue.”
Minnerly and Levine have been colleagues for forty years. They started out by working for independent production companies in and around New York City, doing ENG-style production. Eventually, they graduated into multi-camera shows where they could use paint control and camera matching to pay better attention to video picture quality.
“There was a wire coming back from the camera,” Minnerly explains. “It went to some other area where the video engineer would try to match the cameras, if there was more than one. If there was only one camera, they would try to make that picture look the best it could be.”
“It was all in standard definition,” says Levine. “Then, in the late 80s, high definition showed up on the scene.”
A matter of national interest
High-definition television actually dates back to the early 1980s, when Japanese engineers developed HiVision, a 1,125-line interlaced TV standard. In 1981, Japanese public broadcaster NHK demonstrated HiVision in the United States for the first time, prompting then-President Ronald Reagan to declare the introduction of HDTV a “matter of national interest.”
If HD was a matter of national interest, it proved to be a low-priority one. For years, HDTV languished in the US. Then, near the end of 1985, producer Barry Rebo was scouting a project in Paris and happened to see the technology while visiting a friend. What he saw shaped his creative future.
Rebo became Sony’s first American HDVS customer, buying one HD camera, four video tape recorders, monitors, a switcher, and an Ultimatte HD for compositing, for a cool $1.5M. Rebo and his production house, Rebo Studios, would eventually partner with NHK and become one of the biggest HDTV producers in the business, rivaling CBS and PBS.
Minnerly and Levine joined Rebo Studios around 1987 and they were instrumental in getting this advancing technology off the ground. From commercials and music videos to Wild Life Adventures for Turner Original programming and the documentary Passage to Vietnam, Barry Rebo and his twelve-person production house did it all. They were committed to Rebo’s vision of achieving “film style” high-definition video storytelling.
Although techno-political debates stalled the advent of HDTV for nearly ten years—from 1987 to 1996—Rebo and his crew kept the faith. During this time, Minnerly and Levine created ReStore, an HD video board that enabled any Macintosh imaging software to run in HD.
They also developed a remote iris control system, called the Stopbox, which adjusted cine-style lenses on HD and SD video cameras. All the while, Rebo Studios was churning out HDTV programs like Fool’s Fire, a puppet drama for PBS’s American Playhouse directed by Broadway legend Julie Taymor.
So is it any wonder that when acclaimed director Sidney Lumet decided to step into the world of HD, he called on Rebo Studios along with Minnerly and Levine to make it happen?
100 Centre Street
At the turn of the century, Sidney Lumet was looking to return to his roots. Before becoming one of the most revered and often-imitated film directors of his time, Lumet learned his craft by turning out live television dramas.
The young Sidney Lumet joined CBS in 1950, just two years after Milton Berle had turned the TV set into a must-have item. After a technical apprenticeship, he directed episodes of Danger, a mystery anthology series, and You Are There, a historical presentation told in the style of a news report. Both were broadcast live.
His productions also included live broadcasts of Tennessee Williams’s The Fugitive Kind and Robert Penn Warren’s All the King’s Men. In 1957, Lumet left television to direct 12 Angry Men, a feature film that had originated as a live broadcast. It earned him his first Oscar nomination.
Ironically, it was the use of film that drove Lumet away from television and into full-time feature-length production. Live drama broadcasts had no real home at the end of the 1950s. A filmed show could be moved over to the lucrative rerun market, but a live broadcast tended to disappear into the ether after it was aired.
At the time, the only way to preserve and distribute a live broadcast was the kinescope: essentially a film camera pointed at a TV monitor, recording the show as it aired. Kinescopes were notoriously mediocre in quality, hence the industry’s move to pre-filmed television.
“To me, filmed TV combined the worst aspects of both worlds,” Lumet later told the New York Times. “You lost the advantage of live performance. And you never had the time to go for the visual perfection that you can in movies.”
“Filmed TV combined the worst aspects of both worlds.”
But in 1999, Lumet’s interest in live performance was reawakened when NBC asked the director to write a TV pilot. The result was 100 Centre Street, a story about the people who inhabit a Manhattan night court. Although NBC eventually passed on the pilot, A&E picked it up in turnaround, making it the network’s first original dramatic series.
Lumet was at first reluctant to take on such a long commitment. Creating a TV show is time-consuming, much more so than a traditional feature film. But he had a change of heart when he saw the sharpness and detail of Sony’s HD system.
“What was so impressive about this Sony camera,” Lumet told NYT, “was that it gave me what my eye saw. That began to thrill me because being a primarily realistic director and not a high-style director, that reality has always been important to me.”
New tech, old tricks
Lumet agreed to direct and produce 100 Centre Street on the condition that he could go back to his old ways, at least partially. He decided to film the show in a modified live-television style, using three HD cameras. This setup allowed the actors to perform entire scenes without interruption. Lumet was hoping to recapture the spontaneity that characterized his earlier works.
Which brings us back to Minnerly and Levine, who had to design a self-contained 33-foot production truck for use at the studio and on location. Inside the truck were four HDW-F500 24P HD VTRs, one of which Lumet used to capture the live line cut.
He directed via a Snell & Wilcox HD 1010 switcher. Simultaneously, the show was recorded from each of the three CineAlta cameras onto their own videotapes. The plan was to use Lumet’s line cut as reference, and afterwards the show would actually be cut using the HD camera masters.
The Cube: possibly the very first DIT cart
“When we started on 100 Centre Street, nothing was portable,” Levine recalled. “We effectively built this truck with all the bits and pieces we needed. The recorders were very big. Camera control was not huge, but big enough.”
Minnerly and Levine had been dealing with HD almost exclusively since 1986. They were two of the few engineers in the United States that had experience troubleshooting high-caliber HD video productions.
Because 100 Centre Street would often need to shoot on location, the duo planned to run up to a thousand feet of fiber optic cable to get the Sony F950 cameras’ HD signals back to the truck. To their dismay, Sony told them it could not deliver the new cameras to anyone by the first day of shooting. They had to come up with a different plan.
A&E was determined to shoot 100 Centre Street in 24P, so the production rented three existing HDW-900 cameras from Plus8 Video. Now, Minnerly and Levine had to figure out how to get the signals back to the production truck.
Their solution was to set up a rolling cart right next to set and run a 100-foot cable harness from each of the three cameras. Each harness carried color control, genlock, intercom, and SDI signals, and fed video to an Evertz X-HD9504 router, which became known as “The Cube.” The Cube converted the HD SDI signal to optical, which was carried by fiber optic cable back to the truck.
“Sometimes we did things like drop the fiber out of a fourth-story window,” Minnerly says. “We could leave the truck downstairs and only bring a minimal amount of gear into the location. That was a big help, because the locations were usually small.”
An HD monitor was also set up on the Cube so DP Ron Fortunato could see how the cameras were handling the lights. Eventually, a still store was added so he could match the look of every scene, even if the scene in question had been shot months earlier. A Leader LV 5152DA digital waveform monitor was also set up inside the truck so Minnerly or Levine could check the specs of the HD signal.
Having the Cube on set gave 1st AC Kent Miller the freedom to set up all of the cameras before the remote truck even arrived. A high-quality monitor nearby also let Fortunato start lighting as soon as possible. Often, Lumet would arrive and be able to start planning camera shots immediately. If this sounds familiar, it’s because it is very close to the way we work on TV and film sets today.
As is often the case with innovations, A&E had some concerns about this HD setup. Live-on-tape multi-camera production had worked well in sitcoms for years, but it was an unusual technique for a prime-time drama series. The network’s concerns proved to be unfounded, though. 100 Centre Street became the first TV show to use multiple 24P cameras in the studio and on location, and the HD system saved the production 50 percent over comparable film costs.
DIT One and DIT Two
“We had just about gotten through the first season of 100 Centre Street,” Minnerly recalls, “when Local 600 came to us and basically said, ‘We’re going to create a union position for you.’ Abby and I were the only two guys on the show that there wasn’t a union position for. We originally wanted to be called Digital Imaging Engineers.”
“We couldn’t use the word ‘engineer’ though,” Levine points out. “Local 52 was already using ‘engineer.’”
“But we could use ‘technician,’ so Abby and I became DIT One and DIT Two,” Minnerly laughs. “We continually argue about who was number one.”
Now that the Digital Imaging Technician was an established union role, Minnerly and Levine had to chart a path through this new “film style” HD world that they had helped create.
“None of the handful of us who did this work owned any equipment,” Levine explains. “It was all prohibitively expensive in the early days. So I would go to the rental facility, and say, ‘I need a switcher, I need a waveform monitor, I need a high-quality monitor, I need a recorder, and I need a cart to build all this on.’ You had to piecemeal it together depending on the job you were doing.”
“You had to piecemeal it together.”
“Independent people like myself had to design and build a cart for every particular job,” Levine continues. “It was not an insignificant part of the prep. That persisted for years, until equipment started to become more affordable.”
As technology advanced, HD equipment quickly became smaller and more universal. “Instead of having to dismantle a cart and rebuild it every day,” says Levine, “you could roll it on and off the truck as-is. Those carts weren’t universally designed like they are today. But equipment became more flexible and cameras started recording on-board. All of this conspired and gave people the ability to buy the equipment themselves.”
Panavision’s Genesis and on-set color correction
100 Centre Street was far from the only project using HD video at the turn of the century. In 1999, George Lucas was cutting digital footage into Star Wars: Episode I – The Phantom Menace, and a year later he decided to go fully digital for Attack of the Clones. A whole new world of “film style” HD video seemed to be approaching.
“Movies were being shot using high definition,” Levine explains, “and there was technology to transfer it back to a film negative so it could go down the traditional film post production route. But the idea of shooting electronically and then going back to film became kind of archaic.”
“Filmmakers wanted to stay in the electronic universe, but they were losing the flexibility of having a film negative.”
“Filmmakers wanted to stay in the electronic universe.”
The answer to this was Sony and Panavision’s Genesis, the first digital camera designed from the ground up for making motion pictures. The key to the camera’s new technology was its large-format Super 35 single-chip sensor and its ability to use a wide variety of cinema lenses.
Importantly, the Genesis was also not tethered to an external recording system. Free from excessive cables, this new camera gave filmmakers the flexibility they were used to from using traditional film cameras.
But the Genesis also sported an important on-set advancement: the GDP, or Genesis Display Processor. It was the first on-set cinema color correction box that could load full-RGB LUTs (look-up tables) and apply them to the live image. With the GDP, filmmakers could shoot in PanaLog color space and immediately see a color-corrected picture on a high-quality monitor.
“It wasn’t an interactive, live previewing box,” Levine points out. “It wasn’t adjustable. You had to build a LUT and put it into this box to use it. The GDP had maybe six preset LUTs that you could load into it. That box lived on my cart and a Log picture would come back from the camera. You would feed the image through this box, and then you could select any of these six LUTs, but you couldn’t make live adjustments to anything.”
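The pipeline Levine describes, a Log image passed through a fixed lookup table on its way to the monitor, boils down to a per-channel table lookup with interpolation. Here is a minimal sketch in Python with NumPy; the 17-entry gamma-like curve stands in for a real GDP preset and is purely illustrative:

```python
import numpy as np

def apply_1d_lut(image, lut):
    """Apply a 1D LUT to normalized [0, 1] pixel values by linear
    interpolation. `lut` holds output values sampled at evenly
    spaced input positions."""
    n = len(lut)
    x = np.clip(image, 0.0, 1.0) * (n - 1)   # position in the table
    lo = np.floor(x).astype(int)             # lower sample index
    hi = np.minimum(lo + 1, n - 1)           # upper sample index
    frac = x - lo                            # blend factor between samples
    return lut[lo] * (1 - frac) + lut[hi] * frac

# Hypothetical "Log -> display" preset: a simple gamma-like curve.
lut = np.linspace(0.0, 1.0, 17) ** 2.2
pixels = np.array([0.0, 0.25, 0.5, 1.0])
print(apply_1d_lut(pixels, lut))
```

Real on-set LUTs are usually 3D cubes (e.g. 17×17×17 or 33×33×33) interpolated trilinearly, but the principle is the same: the box never alters the recording, only the picture sent to the monitor.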
The GDP was a step in the right direction, but it didn’t seem to be enough.
“The image was viewable, but inflexibly so,” Levine says. “You knew that you would have that flexibility afterwards, but you couldn’t carry what you were doing on set into the post universe.”
After seeing the GDP in action, other companies began developing their own LUT boxes. Levine realized that Blackmagic’s HDLink Pro converter potentially had live, adjustable color correction capabilities. After some experimentation, he and programming partner Martin Port wrote the LinkColor software, which became the go-to solution for on-set color correction for years. Eventually, it was unseated by Pomfort’s Livegrade software.
“All of this led to the DIT becoming a connection between the cinematographer’s vision and preserving the image for post,” Levine says. “This graduated into previewing color live on set and taking great care to calibrate monitors and equipment. It all fell under the DIT’s purview.”
“It all fell under the DIT’s purview.”
The HD transition was not painless. Many cinematographers at the time possessed a certain amount of mystique and some of them wanted to keep it that way. But others were excited to embrace the technology while also imposing their artistic look. “These DPs did not want to have a gap in the middle where an unsupervised colorist could decide what their film would look like,” says Levine.
“The two main reasons we were on the job,” Minnerly adds, “was to minimize the amount of color correction needed in post, which was going to save the production money, and to have the pictures look how the DP wanted them to look. That way, everybody gets to see a nice looking picture and we can send all that information off to the post house. Then, all they need to do is apply a LUT and hopefully things are right. They don’t need to do a ton of extra work.”
What’s on your DIT cart?
Modern-day DITs have many responsibilities including—but not limited to—image management, video routing, color grading, data movement, quality control, and networking. A good DIT uses a keen knowledge of software and technology to efficiently shepherd the DP’s vision into post production, and they do it all from that mysterious DIT cart.
According to Levine, his cart contains “a couple of 24-inch Sony OLEDs—they’re not high dynamic range and they are not 4K, but they represent the color accurately—a traditional waveform monitor, Teradek COLRs, a routing switcher so that I can look at Log pictures and corrected pictures and juggle them around, and frame capture devices so I can grab stills and video.”
“A good DIT uses a keen knowledge of software and technology to efficiently shepherd the DP’s vision into post production.”
“There’s other things I could have,” he admits. “I’ll probably have to upgrade some hardware for my next show. But I’ve worked with this DP more than once. He’s happy to work this way.”
Additionally, a modern-day DIT cart may have wireless video receivers attached (Teradek is a current favorite) or RAID hard drives for downloading and data management, but not always. Wireless video and media downloading are sometimes the purview of the Camera Loader, another Local 600 position. Having Loaders on set can allow the DIT to concentrate on more important things, like assisting the cinematographer, or grading the live camera signal and then saving those corrections as CDLs (Color Decision Lists).
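Those CDL corrections have a simple mathematical core: the ASC defines them as a per-channel slope/offset/power transform followed by a saturation blend against Rec.709 luma. A minimal Python sketch of that transform, where the grade values are illustrative and not drawn from any real show:

```python
import numpy as np

def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    """Apply an ASC CDL grade to RGB values in the last axis:
    out = max(in * slope + offset, 0) ** power per channel,
    then blend each channel toward Rec.709 luma by `saturation`."""
    rgb = np.clip(np.asarray(rgb, dtype=float) * slope + offset,
                  0.0, None) ** power
    weights = np.array([0.2126, 0.7152, 0.0722])       # Rec.709 luma
    luma = np.sum(rgb * weights, axis=-1, keepdims=True)
    return luma + saturation * (rgb - luma)

# Illustrative grade: slight lift in blue, mild overall desaturation.
graded = apply_cdl([0.5, 0.5, 0.5],
                   slope=np.array([1.0, 1.0, 1.1]),
                   offset=np.array([0.0, 0.0, 0.02]),
                   power=np.array([1.0, 1.0, 1.0]),
                   saturation=0.9)
print(graded)
```

In practice the ten CDL numbers (three slopes, three offsets, three powers, one saturation) travel with the footage as metadata, which is what lets post reproduce the on-set grade rather than guess at it.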
“More than anything,” Levine adds, “a DP wants to know that somebody has his back.”
The future of the DIT
Advancements like Frame.io Camera to Cloud and increasing multi-departmental connectivity are driving innovation on TV and film sets, and DITs will almost certainly be the ones that productions turn to for guidance.
As remote work continues to grow in post-production, so too will the demand for things like robust on-set internet, on-the-day editing, and device-based image viewing. The responsibility for these emerging technologies, and their inevitable troubleshooting, is most likely to fall on the DIT’s shoulders.
While these changes seem inevitable when it comes to things like live broadcasts, trade shows, and even camera tests, what about good old-fashioned narrative filmmaking? Will that landscape really look that different in the future?
It’s hard to tell, but consider this: people are saying that streaming is just turning into cable all over again. And a growing number of creators are advocating for traditional single-camera filmmaking over the ubiquitous multi-camera approach.
Even the videos exploding across YouTube and TikTok seem to echo the zany musical shorts that used to play between vaudeville acts and inside nickelodeons. Just as with Sidney Lumet and his “live” HD video cutting, emerging technologies may just be giving us new ways to do the same thing all over again.
Whatever the case may be, that technology will always be moving forward, and the DIT and their trusty cart will be rolling (physically and metaphorically) alongside it.