AI and Euro 2024: VAR is shaking up football — and it’s not going away

Sports physicist Eric Goff explains how updates to the technology can help referees to make the toughest calls

VAR technology uses cameras, sensors and artificial intelligence to help referees make decisions. Credit: Gerrit Van Cologne/ANP via Getty

When the UEFA Euro 2024 football tournament kicks off tomorrow, the all-seeing eye of artificial intelligence (AI) will be fixed on the action more intently than the eyes of even the most ardent fans. Referees will be able to track every slight movement of the ball and players thanks to the latest video assistant referee (VAR) technology.

Since its introduction in 2016, VAR has been hotly debated among football fans. Inconsistencies in the way referees apply the technology, and the time they sometimes take to reach decisions, have fuelled discontent. The English Premier League even held a vote last week on whether to scrap VAR altogether — clubs ultimately voted 19 to 1 in favour of keeping it, but the opposition highlighted the need for improvements.

An upgraded, semi-automated version of the technology, incorporating more-advanced AI and a real-time location-tracking chip embedded in the ball, was first used at a major global tournament at the 2022 FIFA World Cup in Qatar. Euro 2024 will feature its latest iteration. Nature spoke with John Eric Goff, a sports physicist at the University of Lynchburg in Virginia, about how far VAR has come and what the future holds for it.

How does AI observe a football match?

I think the perception when people hear ‘AI’ is of a sentient being working alongside humans. But these are algorithms and machines capable of rapidly processing large amounts of data that a referee wouldn’t otherwise have access to. Ten cameras placed under the stadium roof, above the football pitch, are able to track 29 locations on each player’s body. So, with 22 players on the pitch, there are more than 600 points in motion. These data are fed into a computer 50 times per second. Together, the cameras can essentially tell you in real time where the players are on the pitch, where the ball is and how fast the ball, the players and their body parts are moving.
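
To put those figures in context, here is a minimal back-of-the-envelope sketch in Python; the totals follow from the numbers Goff cites, and the variable names are illustrative assumptions rather than anything used in the real system.

```python
# Back-of-the-envelope tally of the tracking numbers quoted above.
# Names and structure are illustrative only.
PLAYERS_ON_PITCH = 22      # both teams combined
POINTS_PER_PLAYER = 29     # tracked body locations per player
CAMERA_RATE_HZ = 50        # optical tracking updates per second

tracked_points = PLAYERS_ON_PITCH * POINTS_PER_PLAYER
positions_per_second = tracked_points * CAMERA_RATE_HZ

print(f"Points tracked per frame: {tracked_points}")            # 638 (> 600)
print(f"Point positions streamed per second: {positions_per_second}")  # 31,900
```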

What information can the ball itself provide?

If you were to cut the ball open and look inside, you would see a little sensor at the centre, attached by wires to the outer shell. This inertial measurement unit records the ball’s position and movements. It beams data at 500 hertz, which is 10 times faster than the cameras being used in the stadium. These data can be combined with the camera rendering to track the ball’s location with respect to a player’s body. The chip can determine the precise time and point of contact whenever the ball receives an impulse from a kick or a player’s hand. This will be crucial for deciding really tough calls on a goal or a handball.
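
The interview doesn’t describe the chip’s detection algorithm, but one simple way to flag an impulse in a 500-hertz accelerometer stream is to look for a sudden jump in acceleration. The sketch below is a hypothetical illustration of that idea, not the actual implementation; the threshold, data format and function name are all assumptions.

```python
# Hypothetical sketch: flag ball contacts in a 500-Hz accelerometer stream
# by looking for sudden jumps in acceleration magnitude.
from typing import List, Tuple

SAMPLE_RATE_HZ = 500        # chip reports 10 times faster than the 50-Hz cameras
SPIKE_THRESHOLD = 50.0      # assumed jump (m/s^2) that marks an impulse

def detect_contacts(accel_magnitudes: List[float]) -> List[Tuple[float, float]]:
    """Return (time_in_seconds, acceleration) for samples where the
    acceleration jumps sharply relative to the previous sample."""
    contacts = []
    for i in range(1, len(accel_magnitudes)):
        jump = accel_magnitudes[i] - accel_magnitudes[i - 1]
        if jump > SPIKE_THRESHOLD:
            contacts.append((i / SAMPLE_RATE_HZ, accel_magnitudes[i]))
    return contacts

# Toy trace with one sharp spike (a simulated kick at the fourth sample)
print(detect_contacts([9.8, 9.9, 10.1, 95.0, 12.0]))  # [(0.006, 95.0)]
```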

There was a famous case of Cristiano Ronaldo scoring a goal for Portugal at the last World Cup. It looked as though he had got a header on the ball, but the chip showed that the ball never made contact with his head, so the goal was awarded to his teammate.

How is all of this data useful to referees?

One major application is detecting violations of the offside rule (known as semi-automated offside). The AI can render the 29 points of data it collects for each player in three dimensions. Using this, it can fill out the player’s form beyond those 29 points on the basis of algorithms for skeletal structures. It can then look at where a given body part is relative to the plane that represents an offside position.
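
As a rough illustration of the geometry involved, the sketch below checks whether any tracked point on an attacking player’s body is nearer the opponents’ goal line than the second-last defender. It is a deliberate simplification with an assumed coordinate convention, and it ignores many of the real offside rules (arms don’t count, and the positions of the ball and the halfway line matter).

```python
# Simplified offside geometry: is any tracked point of the attacker nearer
# the opponents' goal line than the second-last defender? Coordinate
# convention and names are illustrative assumptions.
from typing import List, Tuple

Point3D = Tuple[float, float, float]  # (x along pitch length, y across pitch, z height)

def is_beyond_second_last_defender(attacker_points: List[Point3D],
                                   defender_xs: List[float]) -> bool:
    """x increases towards the defending team's goal line."""
    second_last_defender_x = sorted(defender_xs, reverse=True)[1]
    attacker_furthest_x = max(p[0] for p in attacker_points)
    return attacker_furthest_x > second_last_defender_x

# Toy example: attacker's shoulder at x = 42.0 m; defenders at 40.0, 41.5
# and 50.0 m (goalkeeper) -> the attacker is past the second-last defender.
print(is_beyond_second_last_defender([(42.0, 30.0, 1.5)], [40.0, 41.5, 50.0]))  # True
```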

For goal-line technology, which determines whether a goal has been scored, there might be cases in which the ball is cradled by the goalkeeper and cannot be seen by the cameras. If you know the position of the ball, it can be rendered in 3D in the computer to work out whether it is over the line.
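
The laws of the game require the whole of the ball to cross the whole of the line, so once the ball’s centre is known, the decision reduces to a comparison against the ball’s radius. A minimal sketch, assuming an illustrative coordinate convention:

```python
# Minimal goal-line check: the whole ball must cross the whole line, so the
# ball centre must be past the line's far edge by at least one radius.
BALL_RADIUS_M = 0.11   # a regulation ball is roughly 22 cm in diameter

def ball_fully_over_line(ball_centre_x: float, goal_line_x: float) -> bool:
    """x increases into the goal; the line's far edge sits at goal_line_x."""
    return ball_centre_x - BALL_RADIUS_M > goal_line_x

print(ball_fully_over_line(ball_centre_x=100.08, goal_line_x=100.0))  # False: only 8 cm past
print(ball_fully_over_line(ball_centre_x=100.12, goal_line_x=100.0))  # True: fully across
```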

AI can make these decisions quickly so that the referee knows whether the player is offside or not, or whether it’s a goal or no goal. The old technology took roughly 70 seconds on average to return a computer-generated offside determination. Now it’s projected to be under half a minute, so referees will be able to make these calls faster.

Does the chip affect the aerodynamic behaviour of the ball?

It shouldn’t. It is fairly well placed at the centre of the ball and there should not be any real mass asymmetry associated with its attachments to the shell. As for the weight, the rules of the game specify a ball mass of around 410–450 grams at the start of the match. The chip itself is around 14 grams, which is negligible. The ball is tested thoroughly for the kinds of kicks and pressures it will be put under, at speeds of more than 90 kilometres per hour, before it’s put out on the pitch for play.
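
As a quick sanity check on those figures, a 14-gram chip works out to roughly 3% of the ball’s regulation mass:

```python
# Quick arithmetic from the figures above: the chip's share of the ball's mass.
chip_mass_g = 14
for ball_mass_g in (410, 450):
    print(f"{chip_mass_g / ball_mass_g:.1%} of a {ball_mass_g} g ball")
# 3.4% of a 410 g ball, 3.1% of a 450 g ball
```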

What are the margins of error with this technology?

There’s always going to be an issue of coverage: are there enough cameras? Is each camera’s field of view wide enough to record every single little iota of the action? Even the algorithms being used to do the rendering are going to have some kind of error; 29 points isn’t every point on the body, so there’s going to be an error associated with forming a 3D render of the body. But we’re talking about very small errors, of less than half a centimetre.

How has VAR changed the referee’s role?

I think the important part of ‘semi-automated’ technology is the ‘semi’ part. There’s still an element of human determination. The one thing that the AI is not going to be able to decide is player intent. If a player’s hand gets on the ball, the system will tell the referee that the ball has been touched, but it might not identify that it was specifically the hand. So there will also be times when the referee can look at a replay and determine that there was perhaps some malicious intent on the part of a given player, which will help to determine a yellow or a red card. And of course, that’s still going to be subjective, and fans will always argue about it.

What does the future hold for refereeing? Will it ever be fully automated?

I don’t think we’re anywhere close to having robotic referees. Things such as fouls and yellow or red cards still require human decision-making. As for the future, I’ve heard some interesting ideas involving virtual-reality contact lenses that referees could use to see the kind of data provided by AI in real time. The faster the computational power gets, the more quickly you can take massive data sets and render them.

doi: https://doi.org/10.1038/d41586-024-01764-4

This interview has been edited for length and clarity.

This story originally appeared on Nature. Author: Sumeet Kulkarni