It was as if the scene were designed to highlight the inadequacies of how the NFL measures first downs.
Late in last January’s AFC title game, with his team clinging to a one-point lead, Buffalo Bills star Josh Allen lowered his left shoulder and plowed forward to try to convert a fourth-and-1 quarterback sneak. Allen vanished into a sea of supersized bodies, making it difficult to figure out exactly where the football should be spotted.
As Allen popped up, extending his right arm to signal that he’d gotten the first down, Kansas City Chiefs defenders celebrated as though they had stopped him. The line judge on the far sideline ran to a spot just beyond the first-down marker. The down judge on the near sideline ran to a spot just behind the line to gain.
“I think he got it,” play-by-play voice Jim Nantz said on the CBS broadcast.
The officials decided otherwise.
Line judge Jeff Seeman yielded to down judge Patrick Holt on where the ball should be spotted. After a replay review, referee Clete Blakeman announced that the ruling on the field would stand. The controversial turnover on downs proved to be a pivotal moment in the Bills’ fourth playoff loss to Kansas City in the past five seasons.
“I mean, that’s a possession,” Bills head coach Sean McDermott said after the game. “We’re up one point, I believe, at the time, a chance to go up maybe multiple scores. It’s a big call. Yeah, it’s absolutely a big call.”
Since the formation of the NFL more than a century ago, the task of officiating games has fallen entirely to stripe-shirted humans. Now, with AI and motion-capture technology leaping forward at a breakneck pace, the NFL is exploring the feasibility of removing human error from key aspects of the officiating process.
The push to modernize began with finding a replacement for the antiquated chain gang. Starting this fall, the NFL will no longer use two orange posts connected by 10 yards of chain to measure whether a team has gained a first down. After officials manually spot the ball, high-definition cameras installed at every stadium will track its precise location relative to the line to gain, saving time and improving accuracy.
Optical tracking technology that captures the movements of athletes and the football in real time could allow the NFL to automate more officiating decisions in the near future.
The league has begun looking into whether the Sony Hawk-Eye optical tracking system can calculate where the football should be spotted if a punt sails out of bounds, NFL vice president of football technology Rama Ravindranathan told Yahoo Sports. Other possible uses for optical tracking, according to Ravindranathan, include determining whether a player has thrown a backward pass, identifying illegal formation or illegal man downfield penalties, or assessing whether a quarterback has left the pocket on potential intentional grounding calls.
Those, in Ravindranathan’s words, are “super aspirational” goals. Then there’s the challenge that Ravindranathan describes as even more “complicated.”
Fans have long clamored for technology to precisely determine where the ball should be spotted on plays like Allen’s infamous quarterback sneak last January. The league is “overdue” to make that change, NBC’s Mike Florio wrote after the AFC title game.
So if there are six high-resolution Hawk-Eye cameras tracking the football at every NFL stadium and a coin-sized chip in every NFL football transmitting data on its location, why hasn’t the league implemented a real-time automated ball-spotting system yet? The roadblock, Ravindranathan says, is “the very nature of football.”
Tracking the location of the football is not enough. Any system would have to pinpoint the ball at the precise moment that the ball carrier’s knee or forearm touches the ground, or that an official blows his whistle to signal forward progress has been stopped. And it would have to communicate its conclusion to on-field referees almost instantly.
“The technology isn’t quite there yet,” former NFL vice president of officiating Dean Blandino told Yahoo Sports. “That’s the next technological iteration that we’ve got to figure out.”

The first AI official
The event that ushered in the era of automated officiating across sports was a high-profile 2004 U.S. Open quarterfinal between Serena Williams and Jennifer Capriati. Four blatantly botched calls went against Williams in the decisive third set alone, including an audacious overrule by chair umpire Mariana Alves on a ball well inside the line.
“Hawk-Eye please,” commentator John McEnroe exclaimed on TV after the last of the blown calls. “This is getting ridiculous.”
Hawk-Eye’s camera-based ball-tracking technology was the brainchild of a former professional cricket player with a PhD in artificial intelligence. Paul Hawkins originally created Hawk-Eye in 2000 to enhance TV broadcasts of cricket matches. It soon spread to tennis broadcasts, enabling TV viewers to verify whether line calls were correct.
The 2004 U.S. Open fiasco compelled tennis governing bodies to begin testing the accuracy and efficiency of the Hawk-Eye system as a potential aid for umpires. By October 2005, the International Tennis Federation announced plans to use the Hawk-Eye system to review disputed line calls. Now, three of the four Grand Slam tournaments no longer use line judges, relying solely on electronic line calling.
The robot referee revolution that started in professional tennis has gradually spread to many major sports.
AI systems have been designed to identify every grab or rotation in a snowboard halfpipe competition, every jump and spin in a figure skating program, every tumbling pass or leap in a gymnast’s floor exercise routine. In many cases, those same AI systems then score the athlete based on technical excellence and degree of difficulty.
Technology that verifies whether a ball has completely crossed the goal line is currently used in top European domestic soccer leagues and at major international competitions. Optical tracking and AI analysis also are used by VAR officials to confirm or overturn referees’ close on-field offside calls.
Major League Baseball is testing an automated challenge system that allows players to request that an umpire’s ball or strike call be overturned. The league tried the system during spring training this year after using it in the minor leagues over the past few years.
The NBA has already begun using Hawk-Eye’s optical tracking system for goaltending reviews to confirm or refute whether the shot was on an upward or downward trajectory when it was blocked. The league and its partners are exploring how to develop the technology necessary to automate difficult out-of-bounds calls that currently bog down games with lengthy, sometimes inconclusive reviews.
Right now, the camera technology “isn’t there yet,” Evan Wasch, the NBA’s executive vice president for basketball strategy, told Yahoo Sports. Hawk-Eye uses cameras to capture the ball and 29 skeletal points on a player’s body. The NBA would need even more precise tracking to determine in real time whose fingertip knocked the ball out of bounds when three players vie for the same rebound.
One potential solution that the NBA is considering is inserting a chip in the basketball that is small enough to go undetected by players yet powerful enough to detect even the slightest fingertip graze. The way Wasch sees it, the combination of that and more detailed optical tracking data could be enough to help the NBA determine quickly and accurately when a ball is touched and by whom.
“If we’re able to put a chip in an NBA ball, it can really support a lot of automated officiating technology,” Wasch said. “Now you’re not just relying on camera technology. You also have the really sensitive accelerometer in the ball to detect those light touches.”
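To make the combination Wasch describes concrete, here is a minimal, purely illustrative sketch. Everything in it is an assumption made for illustration: the spike threshold, the data shapes and the idea of crediting a touch to the nearest tracked fingertip are not the NBA’s or Hawk-Eye’s actual methods.

```python
import math

# Illustrative sketch only: the threshold, the data shapes and the idea of
# crediting a touch to the nearest tracked fingertip are assumptions, not
# the NBA's or Hawk-Eye's actual approach.

TOUCH_SPIKE_G = 3.0  # hypothetical jolt, in g, large enough to suggest a touch


def touch_times(accel_stream):
    """accel_stream: iterable of (t, ax, ay, az) readings from a chip in the ball.
    Yield timestamps where an acceleration spike suggests the ball was touched.
    A real system would also filter out gravity, bounces and sensor noise."""
    for t, ax, ay, az in accel_stream:
        if math.sqrt(ax * ax + ay * ay + az * az) > TOUCH_SPIKE_G:
            yield t


def likely_toucher(ball_pos, fingertips):
    """ball_pos: (x, y, z) of the ball at the moment of a detected touch.
    fingertips: dict mapping player -> (x, y, z) fingertip position from
    optical tracking at that same moment. Return the closest player."""
    return min(fingertips, key=lambda player: math.dist(fingertips[player], ball_pos))
```

In this toy version, the chip supplies the “when” and the cameras supply the “who,” which is the division of labor Wasch is describing.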
To Wasch, there's a double benefit to automating certain officiating decisions.
“If we can get to a point where an automated system is making those determinations in real time, then you could actually train your referees that they don't have to focus on those things,” Wasch said. “Instead they can focus their attention on the much more difficult judgment calls that they have to make in every game.”

The testing ground
There’s one key advantage the NFL has when developing or evaluating new technology: the existence of a second-tier league willing and eager to serve as a proving ground.
The United Football League is an eight-team spring football league formed in 2023 when the latest iterations of the XFL and USFL merged. It has built upon the legacy of the original XFL, which introduced all sorts of innovations to football: some ahead of their time, like the Skycam and mid-game sideline interviews; others never to be seen again, like the “kickoff scramble” and player nicknames on the backs of jerseys.
The second XFL unveiled new kickoff rules, eliminating the long run-up for the kicking team in an effort to encourage returns and reduce high-speed collisions that caused concussions and other injuries. The NFL borrowed heavily from that concept when it changed its kickoff rules before last season.
The UFL also was the first to discard the outdated chain gang, using six high-definition cameras to measure the ball’s location relative to the line to gain in less than half the time required by three men in NFL-issued pinnies. The NFL studied the success and efficiency of the UFL’s TruLine system for the past two years before modeling its own first-down measurement system after it.
“We have a close collaboration with the UFL where we look to them to help us with testing some of the technology,” Ravindranathan said. “Because of that close partnership, it was helpful for us to have them as the first breeding ground for exploring this.”
Innovation, says UFL head of technology Scott Harniman, “is in our DNA.” UFL executives are constantly brainstorming new ways to better the league’s on-field product or fan experience, according to Harniman. The UFL has placed MindFly body cameras on officials to offer TV viewers and the replay booth a fresh vantage point of certain plays. The league has also emphasized transparency, allowing all discussions between officials on the field and the replay booth to be broadcast during the game.
“We pride ourselves on being able to quickly turn something from an idea in testing into a viable product,” Harniman told Yahoo Sports. “Can it improve football? If it’s something that can improve the game of American football, then we want to use it.”
The innovation that Blandino hopes will come sooner rather than later is technology that can pinpoint exactly where the football should be spotted. He considers that a better alternative than leaving those decisions to human officials with imperfect judgment and obstructed views.
“You don’t always have an official looking right down the line,” said Blandino, now the UFL’s head of officiating. “Typically, they’re trailing the play. They’re trying to get the best spot, but it’s difficult. So if we can utilize technology to get better in that area, I think that’s a good thing.”
Those at the NFL and UFL are most intrigued by a potential hybrid solution, one that combines the precise location data from the microchip in the football with the context that optical tracking cameras can provide. An official could theoretically determine the moment a player was down using the Hawk-Eye system’s ability to track player limb movements. Then the official could use the microchip location data to pinpoint the ball's location at that exact moment.
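As a rough illustration of that hybrid idea, the sketch below lines up two hypothetical, timestamped streams: skeletal-tracking frames flag the first instant a knee or forearm reaches the turf, and chip readings are interpolated to estimate where the ball was at that instant. The data structures, field names and threshold are invented for illustration; none of this reflects the NFL’s, Sony’s or the ball chip’s actual formats.

```python
from bisect import bisect_left
from dataclasses import dataclass

# Illustrative only: field names, units and thresholds are assumptions,
# not the actual Hawk-Eye or ball-chip data formats.

@dataclass
class ChipSample:           # one reading from the chip inside the football
    t: float                # timestamp in seconds
    x: float                # yards along the field
    y: float                # yards across the field

@dataclass
class PoseFrame:            # one skeletal-tracking frame for the ball carrier
    t: float
    knee_height: float      # lowest knee above the turf, in inches
    forearm_height: float   # lowest forearm above the turf, in inches

GROUND_THRESHOLD_IN = 1.0   # treat a joint this close to the turf as "down"


def moment_down(frames: list[PoseFrame]) -> float | None:
    """Return the timestamp of the first frame where a knee or forearm
    reaches the turf, or None if the runner never goes down."""
    for f in frames:
        if min(f.knee_height, f.forearm_height) <= GROUND_THRESHOLD_IN:
            return f.t
    return None


def ball_position_at(samples: list[ChipSample], t: float) -> tuple[float, float]:
    """Linearly interpolate the chip's reported position at time t.
    Assumes samples are sorted by time with distinct timestamps."""
    times = [s.t for s in samples]
    i = bisect_left(times, t)
    if i == 0:
        return samples[0].x, samples[0].y
    if i >= len(samples):
        return samples[-1].x, samples[-1].y
    a, b = samples[i - 1], samples[i]
    w = (t - a.t) / (b.t - a.t)
    return a.x + w * (b.x - a.x), a.y + w * (b.y - a.y)


def suggested_spot(frames: list[PoseFrame], samples: list[ChipSample]):
    """Combine the two streams: find when the runner was down,
    then report where the ball was at that moment."""
    t_down = moment_down(frames)
    if t_down is None:
        return None             # play still alive; no spot to suggest
    return ball_position_at(samples, t_down)
```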
For that to work, occlusion is the first obstacle that must be overcome. The Hawk-Eye system struggles to track the ball or players if its view is blocked by a tangle of bodies or by players and support staff standing on the sideline. Then there’s the challenge of seamlessly syncing two different companies’ data sets, and of producing a system that can spit out a reliable conclusion in a matter of seconds.
Asked how far the NFL is from implementing a system that can pinpoint where the ball should be spotted, Ravindranathan admitted, “We are in the very early infancy stages with that.”
“Right now, we are exploring meshing the two different powerful data sources together,” she said. “If it works, great. But in the meantime, while we are in this process, maybe there is an AI entity that will get introduced or maybe there is another company that will approach us with a different solution. Given all that, it’s very hard to say whether we are five years out or two years out.”
As the NFL and other leagues explore automating certain aspects of officiating, their eagerness to remove human error raises some obvious questions: Could games someday be refereed by stripe-shirted R2-D2s? Will AI systems ever become fast and accurate enough to replace human referees altogether?
In football, as with many sports, that outcome is highly unlikely. There are too many subjective calls like pass interference, holding or unnecessary roughness that require human nuance and judgment.
To Ravindranathan, the goal is to explore opportunities to speed up the game without sacrificing quality or accuracy. It’s not to phase out human referees.
“There is a lot of subjectivity and knowledge that is required,” Ravindranathan said. “I don’t think we’ll be there at that place anytime soon.”