Made in Frame: How This Sports Photographer Beats the Clock

Editor’s note: Today’s Made in Frame installment welcomes guest writer Dean Rutz, photographer for The Seattle Times. He’s been described as “the Chuck Norris of Seattle sports photographers. Dean doesn’t capture photos, photos surrender to Dean.” Likewise, we willingly surrender to his expertise as he describes his experience using the Fujifilm C2C workflow for Game 5 of the Stanley Cup playoffs featuring the Seattle Kraken. Since then, he’s continued to explore this new way of working, covering games for MLB, and has generously shared his impressions and insights with us.

As a photographer who covers major sporting events, there’s one overarching truth: speed is everything. My newspaper wants a steady stream of images to use for multiple purposes: social, mobile, and print. The traditional approach to deadlines—getting pictures into the newspaper in time to make press—is long outdated. In today’s news cycle, every minute is a deadline.

Anything that lets me work faster is going to help me meet all the new expectations that news and sports photographers face at a time when viewers want content on demand, because if they don’t get it from one service, they’re more than willing to search for another.

That really becomes the argument for Fujifilm and the C2C workflow: empowering the editor with the speed and reliability they need to make decisions for the audience they serve—whether they’re working locally or remotely.

To be clear, this is a new way of working and there are still workflow issues to be addressed and improvements to be made. That’s part of why Frame.io asked me to write this piece—to show what’s working now, what could work better in the future, and how this changes the way media outlets can compete within their own market.

Game 5 – The Seattle Kraken

On April 26, 2023, Game 5 of the Western Conference First Round of the Stanley Cup playoffs saw the Seattle Kraken take on the reigning champions, the Colorado Avalanche, at Denver’s Ball Arena.

The headline is that C2C worked flawlessly. It was a very big deal because Ball Arena in Denver has no known mechanism for sending in-game pictures. (It’s different in the NBA configuration because they can add ethernet drops to the floor.)

But for hockey? In Games 1 and 2 I had attempted to send from the ice using a Verizon MiFi hub wired to a Canon R3, and I had no success even connecting to a network. But from the exact same positions, with my FUJIFILM X-H2S connected to nothing more than my cell phone, pictures flowed easily in-game.

Which meant that I was the only photographer in the arena who was sending live images.

It might have been possible to dismiss my use of Fujifilm cameras in Game 1 [which took place in Denver], as very few photographers there knew me. And in the first two games in Seattle, I was on the Canon platform and working in the same space as everyone else.

But when I walked into Game 5 with Fujifilm cameras exclusively, people noticed. The League photographers, Getty, and other newspaper photographers asked questions. They laid hands on the cameras and looked at the results.

In today’s news cycle, every minute is a deadline.

Of course, it’s difficult in the compressed environment of a championship series to have deep conversations about cameras and technology. From the minute we walk into the building, we’re all under stress and deadlines. But there were several things that people picked up on right away. For one, everybody liked the design and feel of the FUJIFILM X-H2S, which was amplified when they put the 200mm f/2 on that body.

I traveled with a pair of FUJIFILM X-H2S bodies, one X-H2, and 10-24mm, 50-140mm, 200mm f/2, and 150-600mm lenses. Hockey is a brutal sport to cover: you’re shooting through a 6-inch-wide hole that in most cases sees only about 20 percent of the ice, and the puck flies around so fast you don’t really know where it is at any given moment. It’s humbling. And I won’t say this was my best game by any stretch of the imagination, but I did all right.

And so did the X-H2S with the 50-140. It was my primary lens on ice, and it tracked exceptionally well. You’re nothing without a quality 70-200 equivalent, and in that regard the 50-140 acquitted itself very well.

What really got people’s attention, however, was the C2C integration. Honestly, it even surprised me, given I’d had no success in two previous games there. The fact that I could send images directly out of camera got everybody to turn around in their chairs. And on the Seattle end of the game the editors who used images to update the live game story greatly appreciated it.

Testing the C2C system

Since April, I’ve been further experimenting with Camera to Cloud as an editing solution for remote sports cameras. I’ve only begun to scratch the surface but it’s clear that the potential it brings to a publication on tight deadlines is undeniable. If you look at it as a system, it offers enormous benefits to creatives, whether in the field or behind the editor’s loupe.

I was recently able to test out the system at the MLB venue T-Mobile Park, home to the Seattle Mariners, where they’ve installed Ethernet ports into the first- and third-base wells in advance of the upcoming All-Star game.

I had decided long ago that a remote camera setup on first base would be the most productive remote I could have. While home plate might be the glory shot, there really isn’t a good angle on home in most MLB stadiums that can’t be achieved from your regular camera position. Beyond that, it’s pretty rare to ever have a play at the plate.

First base, however, often sees errant throws, pick-off attempts, or runners beating out the play. But it’s also a hard picture to make any way other than by remote. If you’re on the batter at home and he hits an infield grounder such that there might be a play at first, the likelihood of switching from a 400mm to a 135mm in the few seconds it takes him to get there isn’t very high. It’s much easier to simply hit that remote button if you think there’s going to be a play worth having.

At most you might have a half dozen legitimate plays at first during a game, so the volume of pictures that would actually need to go to the cloud is relatively small. That made it a good test case for trying C2C in-game, especially in light of MLB’s new rules to speed up the game, which have greatly reduced the amount of time I can edit while play is in progress.

For example, when someone hits a home run I’m expected to get the picture into the newspaper quickly to go with live game updates online. I also send them to the beat writer who posts them to his running game story on Twitter. And it goes to the sports editor, who may or may not begin his page design based on whether that play will become the basis of the game story in print.

So, infrastructure and workflow are critical.

In this instance, I’m primarily covering home plate with a FUJIFILM X-H2S and a Fujinon 200mm f/2 plus a 1.4x teleconverter, giving me the equivalent of a 400mm f/2.8. On first base is the remote camera, another X-H2S with a 90mm f/2 (135mm equivalent). I’m triggering it manually with a PocketWizard. Both cameras are running at about 30fps. The first-base remote is on ethernet.
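For anyone who hasn’t done the crop-sensor math, here’s a rough sketch of how those equivalents work out, assuming Fujifilm’s roughly 1.5x APS-C crop factor; the numbers are approximations, not specs.

```python
# Back-of-the-envelope crop-factor math (approximate; Fujifilm APS-C is roughly 1.5x).
CROP_FACTOR = 1.5

def full_frame_equivalent(focal_mm, f_stop, teleconverter=1.0):
    """Return the rough full-frame-equivalent focal length and effective f-stop."""
    eq_focal = focal_mm * teleconverter * CROP_FACTOR
    eq_fstop = f_stop * teleconverter  # a 1.4x teleconverter costs about one stop of light
    return eq_focal, eq_fstop

# Home plate: 200mm f/2 with a 1.4x teleconverter
print(full_frame_equivalent(200, 2.0, 1.4))  # (420.0, 2.8) -- roughly a "400mm f/2.8"

# First-base remote: 90mm f/2, no teleconverter
print(full_frame_equivalent(90, 2.0))        # (135.0, 2.0) -- the 135mm equivalent
```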

Given that I’m only triggering the remote when there’s a play, I only need to edit when I think the play is worth having. I estimate that at most it’s sending 10 frames per play. In this configuration the camera and cloud were both flawless. The speed upstream was such that by the time the play was concluded, and I had the time to make the edit, the pictures were there waiting for me.

In the photo above, the errant throw to first goes underneath the base runner’s leg, and he advances to second base on the play. A good baseball picture any way you look at it.

While C2C passed the test with flying colors upstream, the downstream side was less smooth: although the camera was connected to ethernet, the computer was on WiFi and struggled to maintain its connection to Frame.io. The solution on the next test was to subnet the stadium ethernet, getting both the camera and the laptop onto a wired connection.

For the third game, I decided to stress test the system by putting both the first-base remote and the main camera on ethernet. This gets into the heart of C2C and sports.

It’s not unreasonable to say the average sports photographer can shoot 6,000 or more frames at a single competition. Baseball is a beast because you’re literally going after every swing of the bat. That’s the point of having a camera that can run at up to 40fps: getting the home run ball as it comes off the bat. But who knows which swing that’s going to be? So you go on each and every one. And the frames rack up quickly.

In this particular game the Mariners scored four runs in the fourth inning, and that camera pointed at home was furiously making frames—hundreds and hundreds of frames, all of them going up to the cloud. By the end of the inning, the system was solidly 160 pictures behind in upload. Because the computer was hooked up to ethernet, it was pretty easy for me to monitor the progress of uploads.

What was great was that the Fujifilm camera and transmitter fell more than 160 images behind without camera performance declining at all. It’s hard to imagine any other camera not choking under the weight of that many unprocessed or unsent full-sized JPEGs. But at no point did it ever impact my shooting.

While the camera was approximately 160 frames behind in upload, I waited for the half-inning when I switched from offense to defense (which, by definition, drastically lessens the frames shot) and timed how long it took to send those 160 frames: approximately six-and-a-half minutes, which meant it took about two-thirds of the next inning for them to completely upload. Is there another system that could do it faster?
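To put some rough numbers on that upload speed (these are my own back-of-the-envelope figures based on the timing above, not anything reported by the system):

```python
# Rough throughput math for the backlog described above (all numbers approximate).
backlog_frames = 160      # frames waiting to upload at the end of the big inning
drain_minutes = 6.5       # roughly how long the queue took to clear

frames_per_minute = backlog_frames / drain_minutes
seconds_per_frame = (drain_minutes * 60) / backlog_frames

print(f"~{frames_per_minute:.0f} frames per minute uploaded")   # ~25
print(f"~{seconds_per_frame:.1f} seconds per full-sized JPEG")  # ~2.4
```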

For smaller agencies…C2C offers the ability to compete with any service.

For larger agencies, that kind of end product is relatively easy to achieve because they have the resources to commit whatever assets are required to achieve it. But for smaller agencies (which, admittedly, is 99 percent of all media) C2C offers the ability to compete with any service.

Room to grow

Nothing new or game-changing (pun fully intended) comes without some growing pains, both for the end user and for the manufacturers and developers. As an early adopter of the C2C workflow, I’ve shared some of my insights with both Fujifilm and Frame.io to help them further streamline the functionality and user experience.

For one, without IPTC metadata it was difficult for my editors to understand what they were looking at unless I gave them an explanation, which, of course, I didn’t have time to do. Second, retrieving images from Frame.io meant leaving their publishing platform, an extra step. That’s not to say they wouldn’t use it, especially for quick social media or mobile deliverables, but it will require an adjustment to the way we’ve been used to doing business.
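For context, in a traditional wire-style workflow the caption and credit travel inside the file as IPTC fields. Here’s a minimal sketch of that kind of embedding, assuming the widely used exiftool utility is available; the filename and caption text are just placeholders:

```python
# Sketch: embedding basic IPTC caption and credit fields with exiftool before sending.
# Assumes exiftool is installed and on the PATH; filename and caption are placeholders.
import subprocess

subprocess.run(
    [
        "exiftool",
        "-overwrite_original",
        "-IPTC:Caption-Abstract=Play at first base in the fourth inning at T-Mobile Park.",
        "-IPTC:Credit=The Seattle Times",
        "-IPTC:By-line=Dean Rutz",
        "remote_first_base_0123.jpg",
    ],
    check=True,
)
```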

And then there are issues around the way cameras are synced and images are organized. The way it works currently is that if you’re using multiple cameras, a folder is created for each one. In theory, that’s fine, but for my workflow I want the files from multiple cameras going into a single folder. Why? Because if I’m recording the same play from multiple angles, I want to see each angle at roughly the same time so I can quickly compare which is best.
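Until something like that exists natively, here’s a minimal sketch of the merged, chronological view I have in mind, assuming the per-camera folders have been synced to a local drive; the folder names are placeholders, and file modification time stands in for the actual capture time:

```python
# Sketch: merge per-camera folders into one chronological list for side-by-side review.
# Folder names are placeholders; file modification time stands in for capture time.
from pathlib import Path

camera_folders = [Path("camera_home_plate"), Path("camera_first_base")]

frames = []
for folder in camera_folders:
    for jpeg in folder.glob("*.jpg"):
        frames.append((jpeg.stat().st_mtime, folder.name, jpeg.name))

# Sorting by time puts every angle of the same play next to each other, regardless of camera.
for mtime, camera, name in sorted(frames):
    print(f"{int(mtime)}  {camera:20}  {name}")
```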

Still, with some real-world adjustments to the gear and the workflow, Adobe, Frame.io, and Fujifilm have the potential to create a paradigm shift in the industry. Making the system work better is just one part of the equation. The other, more pressing, issue is communicating to the industry how leveraging this technology can provide economic benefits that would be hard to realize any other way.

Dean Rutz

Dean Rutz joined The Seattle Times in 1988 as a picture editor, and in the years since has done just about every job that can be done in the newsroom. But starting in the mid-’90s he began gravitating toward sports, and that’s where he spends most of his time today. Rutz has traveled to a dozen different countries for The Times in pursuit of sport, and the stories behind the games. He is a veteran of numerous Olympic Summer and Winter Games, Super Bowls, countless collegiate championships, and America’s Cup races.
