
28 February 2023 · 5 min read

This blog is an excerpt from episode five of our Everyday AI podcast.

The hush of anticipation. The pop of a perfectly centred racquet hitting the ball. The sharp squeak of shoes scuffing against the court. These are the unmistakeable sounds of a tennis match. We know them well. But could you rely on these sounds alone to follow a match?

Courtney Lewis is a blind tennis player who relies on her hearing to play and watch games. But it’s not always easy.

“It is very difficult, I have to zoom in on matches if I want to watch them,” Courtney said.

“And even then, it's hard to keep up with where the ball is. So if I'm listening to it and I can know where it's landed and how it's been hit, it's going to be a lot more engaging.”

One AI tool is helping blind and low vision audiences follow games by applying sound effects to live tennis matches. This system, called Action Audio, turns data from real time monitoring of tennis ball movements into 3D sound so that blind and low vision audience members can follow the game on sound alone.
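To give a rough sense of the idea, here is a minimal sketch in Python of the core step: turning a tracked ball position into a spatialised sound cue that can be placed to the listener's left or right. The names, constants and simple pan/volume model are illustrative assumptions, not the Action Audio implementation.

# Hedged sketch only: map a tracked ball position to a simple stereo cue.
# The field names and constants are illustrative, not Action Audio's.
from dataclasses import dataclass

COURT_WIDTH_M = 10.97  # doubles court width in metres

@dataclass
class SoundCue:
    pan: float     # -1.0 = hard left, +1.0 = hard right
    volume: float  # 0.0 to 1.0, louder when the ball is nearer the listener

def position_to_cue(x_m: float, y_m: float, listener_y_m: float = 0.0) -> SoundCue:
    """Convert a ball position (metres from the left sideline / near baseline)
    into a stereo cue that an audio renderer could play."""
    pan = (x_m / COURT_WIDTH_M) * 2.0 - 1.0       # left-right placement
    distance = abs(y_m - listener_y_m)
    volume = max(0.2, 1.0 - distance / 30.0)      # quieter the further away
    return SoundCue(pan=round(max(-1.0, min(1.0, pan)), 2), volume=round(volume, 2))

print(position_to_cue(x_m=2.0, y_m=11.0))  # ball near the left sideline, far end

A real binaural renderer does far more than pan and volume, but the same position-to-parameter mapping sits underneath it.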

It’s just one example of how the way we watch or play sport is evolving thanks to AI.

Tennis player Courtney Lewis, who is blind, relies on her hearing to play and watch the game.

AI is on the Moneyball

Sport is a numbers game. The points scored, the fouls, the players’ ranks, the time between hits or shots or strokes. Many of these details can boil down to formulas for predicting or strategising how a game may play out.

So it’s unsurprising that AI systems have made their way through many elements of our favourite sports. They’ve revolutionised coaching, umpiring, and the watching of games.

Stuart Morgan leads the machine learning, AI and performance technology group at the Australian Institute of Sport. His team builds AI tools that automatically extract insights analysts would otherwise have to produce manually: things like tactical game planning, player injury modelling and team formation.

One of their earliest adoptions of AI was driven by the coach of the men's hockey team in 2010, Rick Charlesworth.

Rick was interested in disrupting the game by creating a competitive imbalance. His plan was to increase the interchange rate of his own players and increase the tempo of the game.

“Coaches often have a really strong intuition that something's right, but it's not always possible to measure that or empirically support their intuition,” Stuart said.

So in came Stuart and his team of AI researchers. They built a system that was able to localise the players as they moved around the hockey pitch and measure how fast they were running, as well as the different formations of each team.

With the data they collected, they could then figure out if the changes the coach was implementing were actually working. And they were.

“It was an enormously successful strategy.”
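As a rough illustration of the number-crunching involved, here is a minimal sketch that assumes the tracking system emits per-frame (x, y) pitch positions for each player. The frame rate and function names are my own for illustration, not the AIS system's.

# Minimal sketch: turn per-frame player positions from a tracking system
# into distance covered and average speed. The 25 Hz frame rate is an
# assumption for illustration.
import math

FRAME_RATE_HZ = 25.0

def work_rate(track: list[tuple[float, float]]) -> tuple[float, float]:
    """track: successive (x, y) pitch positions in metres, one per frame.
    Returns (total distance in metres, average speed in metres per second)."""
    distance = sum(math.dist(track[i], track[i + 1]) for i in range(len(track) - 1))
    duration_s = (len(track) - 1) / FRAME_RATE_HZ
    return distance, distance / duration_s if duration_s else 0.0

# A player sprinting at roughly 8 m/s, sampled over three frame intervals:
print(work_rate([(0.0, 0.0), (0.3, 0.1), (0.6, 0.2), (0.9, 0.3)]))

Summed per player and per phase of play, numbers like these are what let an analyst say whether the opposition really was being forced to work harder.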

AI systems can be trained to look for and analyse the strengths and weaknesses of a particular athlete, assessing information like endurance, temperament, speed, flexibility, nutrition, their respiratory system, and more. And they can be applied to all kinds of sports.

Sound: Tennis sounds

Jon Whittle: Anyone who’s watched a tennis match will be familiar with these sounds. And you’ll know that as a spectator, you’re expected to be silent, so that the players have total focus on the ball.

Courtney Lewis: You have to think about, you know, where it lands, where you have to be, where your opponent is, you know, your tactics or you know where you want the ball to go next.

Jon Whittle: Now imagine being a player on that court… and only being able to rely on these sounds.

Courtney Lewis: My name is Courtney Lewis. I am a blind tennis player. I have very terrible depth perception and peripheral vision.

Courtney Lewis: We rely on our hearing to hit the ball.

Sound: Action Audio sounds

Courtney Lewis: There's a bell in the ball, and every time it bounces, it'll rattle. It's very hard to hear it when it's in the air because, you know, it's not hitting anything. So the bell’s very silent.

Courtney Lewis: We also have to try and analyse where it lands on the court. And then there are a million other things we have to worry about. You know, how far away it is from our racket, from the ground. We're so focussed on where the ball is with the bounce, when the rattle goes off, when the other opponent hits it and when it lands. It's a game of focus and concentration.

Courtney Lewis: The audience has to be quiet, commentators have to be quiet, the umpire has to be quiet, even like it's common courtesy for the other opponent to be quiet while we're playing.

Jon Whittle: These sounds we’re listening to are inspired by the sound of blind tennis that Courtney plays…but they’re just sound effects. You can almost picture what’s happening in this game though, right?

Jon Whittle: These sound effects are being applied to live tennis matches using artificial intelligence, allowing blind and low vision audiences to plug in their earphones and follow their favourite athletes as they play.

It’s just one example of how the way we watch or play sport is evolving… thanks to AI.

Jon Whittle: Welcome to Everyday AI, I'm Jon Whittle, the Director of CSIRO's Data61 - the data and digital specialist arm of Australia's national science agency.

In this podcast series we’ve been hearing how artificial intelligence works, and how it is very much part of our daily lives.

We’ll come back to Courtney a little later and we’ll hear exactly how this blind tennis-inspired machine learning technology actually works. But before we get there I want us to get a sense of the bigger picture of how AI is becoming a major player in some of our favourite sports.

Jon: Sports are numbers games. The points scored, the fouls, the players’ ranks, the time between hits or shots or strokes… so many of these details can be boiled down to digits and formulas to predict or strategise how a game or a race may play out.

So it's not surprising, really, that AI systems have made their way through many elements of our favourite sports. They've revolutionised coaching, umpiring, and the watching of games.

Jon: Statisticians have been numerically analysing aspects of the game of baseball, for example, since as far back as the mid-1800s. Things like batting average or number of runs, even the conditions of the day and the expectations of a crowd.

It’s in baseball where we find the origin story of AI in sport. It’s a story that you may be familiar with if you’ve seen the film Moneyball.

Stuart: The Moneyball example was the thin end of the wedge. That was where people started to realise, okay, there are ways that we can, with statistics and measurable inputs, start to understand the sport better and start to understand the behaviours of players better and start to look for ways to outmanoeuvre your opponents.

Jon: This is Stuart Morgan from the Australian Institute of Sport.

Stuart: The idea behind Moneyball was very simple: the coaches and scouts used intuition to evaluate the potential of players and to value players.

Jon: Before the 1990s, the teams with the most income could buy the biggest players… the high-school standouts with the best batting averages. The Oakland Athletics team, however, were working with a pretty small budget.

Stuart : They needed to find ways to outmanoeuvre the league, make better use of the resources that they had, and more efficiently deploy those resources into players that had been undervalued in the market.

Jon: They did something completely different to their big budget competitors. They took a rigorous data-driven approach to field a team that could outsmart their richer rivals.

Stuart: Starting with a group of people who were essentially fans and who took a real interest but also had actuarial skills or accounting skills and deployed those statistical skills to look at the very rich historical collection of statistics in baseball. They were able to start to recognise that some players were actually undervalued.

Jon: The approach led the Oakland Athletics to a 20-game winning streak, which was considered some kind of miracle on their low budget. This level of analytics was new in the 90s, but it inspired the integration of AI into a number of sports today. And it's Stuart Morgan's bread and butter.
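To make the idea concrete, here is a toy version of that kind of value-for-money screen: rank hitters by how much on-base percentage each million dollars of salary buys. The players and figures below are invented purely for illustration; the real analysis was far richer than one ratio.

# Toy Moneyball-style screen with invented players and numbers:
# rank hitters by on-base percentage per million dollars of salary.
players = [
    {"name": "Player A", "obp": 0.360, "salary_musd": 1.2},
    {"name": "Player B", "obp": 0.345, "salary_musd": 4.5},
    {"name": "Player C", "obp": 0.390, "salary_musd": 9.0},
]

for p in sorted(players, key=lambda p: p["obp"] / p["salary_musd"], reverse=True):
    print(f'{p["name"]}: {p["obp"] / p["salary_musd"]:.3f} OBP per $1M')

The principle is the same as the one Stuart describes: find the statistics that actually matter, and buy them where the market has underpriced them.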

Stuart: I lead the machine learning AI and Performance Technology Group at the Australian Institute of Sport.

Jon: Stuart develops computer vision AI systems that can turn everyday video footage of athletes into insights.

Stuart: We film training sessions, we film competitions, and coaches are routinely interested in things that are observable in the video. So broadly speaking, my role is to lead the group that builds the tools that allow us to extract things autonomously from vision, provide new insights for coaches, make the lives of the analysts easier by automating some of the things they would otherwise have to do manually at 2am, and generally try to gain some kind of a competitive advantage over other countries operating in the same kinds of space and sport.

Jon: Artificial intelligence is also being used by coaches for things like tactical game planning, player injury modelling and team formation. Stuart says that one of the earliest adoptions of AI for the Australian Institute of Sport was driven by a coach in 2010.

Stuart: The coach of the men's hockey team at the time was a guy called Rick Charlesworth, who had had considerable success previous to that with the women's hockey program. He was interested in ways that he could disrupt the game and change the nature of competition and create a competitive imbalance. And one of his ideas around that, which was a clever and multilayered approach, was to increase the interchange rate of his own players and massively increase the tempo of the game. And even to the point where he took his own best players off the pitch more regularly to give them shorter bursts of high intensity game play on the pitch. And the overall objective for this was to wear out our opponents who were playing with a more conservative style of the game. Coaches often have a really strong intuition that something's right, but it's not always possible to measure that or empirically support their intuition. So the challenge for us with Rick was to find a way of tracking opposition players as they moved around the hockey pitch so that we could understand whether the tactical changes that Rick was deploying were actually increasing the work rate of our opponents.

Jon: In comes Stuart and his team of AI researchers… They built a system that was able to localise the players as they moved around the hockey pitch… and measure how fast they were running, and the different formations of each team.

With the data they collected, they could then figure out if the changes the coach was implementing were actually making it much more difficult for their opponents.

Stuart: They were having to work at a much higher intensity. And because our players were interchanging so much more frequently, and other parts of the game strategy were geared to favour us in that domain, it was an enormously successful strategy.

Jon: AI systems can be trained to look for and analyse the strengths and weaknesses of a particular athlete, assessing information like endurance, temperament, speed, flexibility, nutrition, their respiratory system… a whole heap of stuff. And they can be applied to all kinds of sports.

Stuart: We've recently finished a big piece of work for swimming that was deployed at the Tokyo Olympics. But we're shifting ground a little bit and we're trying to build tools that are much more customisable and sport agnostic, so that we can build one set of tools, make them available to the sporting network, and then people across all the different nooks and crannies of the national sporting network, whether it's golf or badminton or race walking or something else that might not normally be a big enough sport for us to invest a lot of resources into, we're able to kind of support those sports with sport agnostic AI tools.

Machar: In a tennis context, there's a few different applications for artificial intelligence. And one of the most prominent is its use in officiating.

Jon: This is Machar Reid. Machar is Head of Innovation at Tennis Australia, the organisation that runs competitions like the Australian Open. Machar tells me that AI is transforming the roles of our umpires and referees.

Machar: If we wind back the clock a touch, and not too long ago really, we were obviously relying on the human eye to call each and every ball on a tennis court. And then along came this thing called Hawk-Eye in the early 2000s, which revolutionised, transformed the way that our sport went about officiating.

Jon: Hawk-Eye is a computer vision AI system that tracks the ball… like a hawk… and lets the referee know if and when it has fully crossed a court or goal line. It’s now used to help eliminate errors in scoring in a number of sports like cricket, badminton, rugby union, volleyball, association football, Gaelic football, and even hurling.

Machar: In more recent years, it expanded in tennis, whereby we use electronic line calling across each and every match at the Australian Open. So what that really involves is eight or ten cameras on each court essentially communicating with one another and tracking the ball as well as the player, 50 times a second. And then what happens is that by virtue of the way that the cameras are able to communicate with one another, you're able to provide a prediction or estimate, representing whether the ball was in or out on the tennis court.

Jon: Note that word estimate. Like any AI, Hawk-Eye isn't perfect. So just how accurate is it?

Machar: We're talking about millimetres in and around how precise Hawk-Eye can become, whereas the human eye, you're probably talking closer to the centimetre type piece. But again, that depends on where they are as a line person, whether you're sitting on the baseline or a single sideline or double sideline. And it also depends on the speed of the ball and the direction from which it's being hit, whereas a lot of those elements aren't as big a factor when you're using something like electronic line calling. Artificial intelligence overcomes that, so it can scale far more effectively than the human eye.
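Real electronic line calling fuses calibrated camera views into a ball-flight model. As a much-simplified sketch of only the final step, imagine combining several per-camera bounce estimates and comparing the result, and its spread, to the line. The function and numbers below are illustrative assumptions, not Hawk-Eye's method.

# Much-simplified, hypothetical sketch of an electronic line call.
# Each camera contributes an estimate of where the ball bounced relative
# to the outside edge of the line (positive millimetres = out).
import statistics

def call_ball(bounce_estimates_mm: list[float], line_mm: float = 0.0) -> str:
    centre = statistics.mean(bounce_estimates_mm)
    spread = statistics.stdev(bounce_estimates_mm)  # crude uncertainty measure
    if centre - spread > line_mm:
        return "OUT"
    if centre + spread < line_mm:
        return "IN"
    return "TOO CLOSE TO CALL"  # a real system would still commit to a call

print(call_ball([3.1, 2.4, 4.0, 2.8]))      # a few millimetres out -> OUT
print(call_ball([-6.0, -5.2, -7.1, -6.4]))  # clearly inside -> IN

The word estimate matters: the output is a statistical reconstruction with millimetre-scale uncertainty, which is still far tighter than the centimetre-scale human eye Machar describes.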

Jon: And what do the players think of it?

Machar: I think every single tennis player on the planet really embraces and enjoys the objectivity. No longer are they looking at a human and trying to almost pick faults, pardon the pun. Now it's essentially a machine that's far more precise and is able to operate at really high speed.

Jon: There’s another benefit to this AI tech. It’s really fun for the fans. All that AI line calling information gets packaged up to entertain all of us watching in the stadiums or at home.

Machar: How do you present information so that fans at home can better understand how remarkable performances are? How special certain moments are, and AI can really help with that… So in a similar way, basketball's done that when you're taking shots from different parts of the court: how difficult might that three-point shot be versus another three-point shot taken by a different player?

Machar: The same applies to tennis. So can you actually value each and every shot or assess its difficulty in an objective way on a tennis court? Is it perfect for every fan? No, but it absolutely provides a talking point.

Jon: There are plenty more ways in which AI has changed the audience experience of sports. Things like computers giving us live stats of games, or quickly changing camera angles to follow the action. Things to help us follow races or matches and get a better picture of what's going on.

Which brings us back to Courtney Lewis. Because AI technology can also be used to make the experience of being in the audience more inclusive.

Courtney Lewis: I have only been in the audience for the Ash Barty game. It was insane, but it was very difficult to watch because of the visual impairment, obviously. Ash is a definite inspiration for me. I also love the high-performance players like Rafa and Roger Federer. But I don't watch too much tennis. I love playing. Watching isn't what I usually do.

Jon: When you’re sitting in the audience at a game of sighted tennis, with no rattly ball to follow, the sounds can be pretty hard to distinguish from one another. Not to mention, it is a big faux pas to make any noise in the audience, let alone have someone give you a blow-by-blow of what’s going on.

So while watching Ash Barty win the 2022 Australian Open was really exciting for Courtney, she wasn’t able to truly follow Ash Barty’s game in the same way as someone with full sight.

Courtney Lewis: It was an incredible experience watching someone play at that level. But I couldn't see too much because we were up in the stands, and even if we had been lower in the stands and much closer to the court, it would have been very hard to see, because the ball is moving so much.

Jon: And this is where the AI that uses blind tennis sound effects comes into the picture. Here's Machar again, from Tennis Australia.

Machar: Fundamentally we want to try and make the Australian Open as a spectacle, as an experience, as accessible as it possibly can be. So with that in mind, one of our contemplations or challenges for ourselves is, okay, well, not everyone's able to watch the vision or consume it in the way they might like on the TV. So how can we bring that to life for audiences like the blind community, of which there's hundreds of thousands in Australia? So what we essentially did there was we took that electronic line calling information about the ball 50 times a second and we represented that in 3D sound.

Jon: In this AI system, called Action Audio, data from real time monitoring of tennis ball movements is turned into 3D sound so that blind and low vision audience members can follow the game on sound alone.

If you’re not listening with headphones already, now might be a good time to put some on.

Sound: Action Audio tennis sounds

Jon: The first sound you’ll hear is the bell or rattle, which the computer applies every time the ball bounces. A bounce may also be accompanied by a series of short ‘blips’ to indicate how close the ball is to the line. Kind of like how a car with parking sensors will start beeping if you’re about to back into something.

Jon: When a player hits the ball, we hear a beep sound. And this sound is also positioned in 3D space, like the bounces, so that we can judge whereabouts on the court that hit occurred.
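Putting those pieces together, here is a hedged sketch of how tracking events could be mapped to the sounds Jon describes. The event fields, distance thresholds and blip counts are invented for illustration; they are not Action Audio's actual rules.

# Illustrative sketch only: map ball events from a tracking feed to the
# kinds of cues described above (rattle on bounce, blips near a line,
# beep on a hit), each panned to where it happened on court.
from dataclasses import dataclass

COURT_WIDTH_M = 10.97

@dataclass
class BallEvent:
    kind: str              # "bounce" or "hit"
    x_m: float             # metres from the left sideline
    dist_to_line_m: float  # for bounces: distance to the nearest line

def event_to_cues(event: BallEvent) -> list[dict]:
    pan = round((event.x_m / COURT_WIDTH_M) * 2.0 - 1.0, 2)  # place the sound in space
    if event.kind == "hit":
        return [{"sound": "beep", "pan": pan}]
    cues = [{"sound": "rattle", "pan": pan}]                 # every bounce rattles
    if event.dist_to_line_m < 0.5:                           # near a line: add blips
        blips = 3 if event.dist_to_line_m < 0.15 else 2      # closer = more blips
        cues.append({"sound": "blip", "count": blips, "pan": pan})
    return cues

print(event_to_cues(BallEvent(kind="bounce", x_m=10.5, dist_to_line_m=0.1)))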

Jon: Here's what it sounds like all together, along with the live sounds of the match itself.

Sound: Action Audio tennis sounds - full match

Machar: So now someone at home who wants to listen to the commentary, radio or TV is able to track what the ball and players are doing through sound. Because previously for that to happen, they'd have to have a family friend or someone else sitting alongside them describing, 'Hey, Rafa just hit the ball cross-court and Roger's replied with a shot down the line. And then Rafa is running around and he's hitting a forehand…' and that's really hard to do ball by ball, right.

Jon: The Action Audio tech works with the live radio broadcast of certain games – so listeners can stream the live commentary, with Action Audio applied over the top. It was first trialled at the Australian Open in 2022 and is spreading to more games as the technology gets picked up around the world.

Machar: The success I think has been showcased by virtue of Queen's - which is one of the Lawn Tennis Association's biggest events in the lead-up to Wimbledon - picking up that technology this year, and we're in discussion with the US Open to do the same, because more and more sports are seeing it as an opportunity to introduce a new language, really, to understand sport.

Courtney Lewis: I would watch tennis a lot more often if I had that technology and that equipment at hand, because, it is very difficult, I have to zoom in on matches if I want to watch them. And even then it's hard to keep up with where the ball is. So if I'm listening to it and I can know where it's landed and how it's been hit, it's going to be a lot more engaging.

Jon: For blind and low vision tennis lovers, like Courtney, this technology is a game-changer. At this stage most people will be using it while streaming games at home, because it still takes a few seconds for the AI technology to allocate a sound to each hit of the ball, and tennis is a fast game. But as the technology advances, this should improve.

Jon: So there are heaps of ways that AI is revolutionising sport and making the field more competitive, or more fair, or more accessible for our players.

But there’s one topic we need to touch on. All this AI must be trained on data to learn how to make decisions about line-calls, or player fitness, or the sound of a tennis match. And all that data comes from our athletes.

So how should we be thinking about the ethics of all this data collection, and where should we draw a line?

Toby Walsh: Performance matters significantly in sport. And it's the people on the end of sporting prowess. And therefore it's a place where many technologies are very quickly adopted, where people have already seen the value of collecting data on athletes and using that to try and optimise performance. But equally, you also have to worry about the challenges that poses, whether the data is being collected in an invasive way.

Jon: Professor Toby Walsh is a leading AI researcher, and someone with an interest in the ethics of how AI is developing.

Toby Walsh: It used to be that data was collected when you were on the playing field, when you were competing. Then it was realised of course that it's worth collecting data when you're training and now data is collected even while you sleep. And that of course, you know, raises lots of risks about privacy and the like.

It’s not only data that's being collected for sporting purposes, it's data that's been collected for entertainment. So there's a lot of statistics that's being collected so that we can get a more interesting, you know, real time view of how the athletes are performing and also data being collected for gambling purposes. And there's obviously a lot of money associated with that.

Jon Whittle: And who has the power right now in that industry? I'm presuming it's not the athletes. I mean, is it the case that they're just expected to sign up to any data being collected about them that their employer wants to collect?

Toby Walsh: It depends very much on the sport and the strength of the professional bodies within the sport and the bodies that represent the athletes. But typically, the athletes are in a disadvantaged position. I mean, on one hand, of course, they're very keen to do better personally. And so they're very keen for the data to be collected. But equally, they're not in a very good bargaining position typically to be able to say, 'No, you can't collect this information while I'm sleeping,' or, 'I want this data so that when I leave your club, I can take the data with me to my next club.' And then, you know, there are other things that you have to worry about, long-term things. For example, if we're measuring, as they are doing now in many sports, the number of times that you get a knock to the head, does that then mean you won't be allowed to get life insurance and health insurance when you retire if you've had too many knocks to your head, because that data is now known?

Jon: Here’s Stuart Morgan again from the Australian Institute of Sport.

Stuart: I think it's an important question. And it's not just an important question for the AIS. I think it's an important question for the public more broadly, as security cameras become more and more ubiquitous and face recognition technology becomes more sophisticated. We're already sharing so much data with the rest of the world, whether we know it or not. So sport is no different from that. And…

We operate from a position, I suppose, where we mostly work with footage that is filmed in the public domain. We film a race and if that race is at the Olympic Games, it's broadcast to billions of people. And so we see that as fair game, as a place where you can deploy your systems and be inventive and be innovative and try to eke out those competitive advantages that are unique and actionable.

Jon: It’s similar across all industries as AI is being developed. Just because we can collect data, or do something with AI technology, doesn’t mean we should. The challenge is for us humans collaborating with the tech to build proper regulations and ethical boundaries around our work.

Jon: I’m Jon Whittle, thanks for joining me for Everyday AI.

In our next and final episode, we’re leaving our planet to hear how astrophysicists and space explorers are using AI to learn more about our galaxies and universe, and to understand the origins of life on earth.

Raymond Francis: When the rover was hanging on cables from the descent stage just about to touch down. And the rocket plumes had just begun impinging on the surface and blowing sand and dust away. And the rover was just a couple of metres above the surface. And you're looking down into this sandstorm that is just forming in response to the rocket blast at the rover. When that photo came into mission control, people lost control of their emotions.

Jon: I'm Jon Whittle and Everyday AI is a CSIRO series created by me and Eliza Keck. Alexandra Persley is our supervising producer. And Jess Hamilton is senior producer from Audiocraft. The Audiocraft production team is Jasmine Mee Lee, Cassandra Steeth and Laura Brierley Newton. We'd love to know what you think, so please subscribe to Everyday AI and leave us a review wherever you get your podcasts.

Video: https://www.youtube.com/embed/gcraHmL7sIU

Watching like a Hawk-Eye

In tennis, AI is improving the sport through several different applications. But one of the most prominent is officiating.

According to Machar Reid, Head of Innovation at Tennis Australia, AI is transforming the roles of our umpires and referees.

“If we wind back the clock a touch, we were obviously relying on the human eye to call each and every ball on a tennis court,” Machar said.

“And then along came this thing called Hawk-Eye in the early 2000s, which revolutionised, transformed the way that our sport went about officiating.”

Hawk-Eye is a computer vision AI system that tracks the ball like a, well, you know. It lets the referee know if and when it has fully crossed a court or goal line.

It’s now helping eliminate errors in scoring in a number of sports, including cricket, badminton, rugby union, volleyball, association football, Gaelic football, and even hurling. So how does it work?

“What it really involves is eight or ten cameras on each court essentially communicating with one another and tracking the ball as well as the player, 50 times a second. And then what happens is that by virtue of the way that the cameras are able to communicate with one another, you're able to provide a prediction or estimate, representing whether the ball was in or out on the tennis court,” Machar said.

Note that word estimate. Like any AI, Hawk-Eye isn’t perfect. So just how accurate is it?

According to Machar, it’s a lot more accurate than the human eye.

“We're talking about millimetres in and around how precise Hawk-Eye can become, whereas the human eye, you're probably talking closer to the centimetre type piece,” he said.

“It can scale far more effectively than the human eye.”
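A quick back-of-envelope calculation shows why the output is a reconstruction rather than a direct observation. Assuming a first serve around 200 km/h (an illustrative figure, not one from Tennis Australia), the ball travels more than a metre between 50-per-second samples, so the exact bounce point has to be inferred from a fitted trajectory.

# Back-of-envelope illustration with an assumed serve speed.
SAMPLES_PER_SECOND = 50
serve_speed_kmh = 200                  # assumed fast first serve
speed_ms = serve_speed_kmh / 3.6       # about 55.6 metres per second
gap_m = speed_ms / SAMPLES_PER_SECOND  # distance covered between samples
print(f"Ball travels about {gap_m:.2f} m between samples")  # about 1.11 m

That gap is why the system reports an estimate of the bounce point, albeit one precise to millimetres.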

Professor Toby Walsh, Chief Scientist at UNSW's AI Institute.

The ethical elephant in the room

There’s no question AI is revolutionising sport and making the field more competitive, or more fair, or more accessible for our players.

All this AI must be trained on data to learn how to make decisions about line-calls, or player fitness, or the sound of a tennis match. And all that data comes from our athletes.

So how should we be thinking about the ethics of all this data collection, and where should we draw a line?

We explored this question with Professor Toby Walsh, Chief Scientist at the UNSW AI Institute. Toby says it’s an important issue we need to consider as AI technology becomes more ubiquitous.

“It used to be that data was collected when you were on the playing field, when you were competing," Toby said.

"Then it was realised, of course, that it's worth collecting data when you're training and now data is collected even while you sleep. And that of course raises lots of risks about privacy and the like.”

The challenge for us humans collaborating with AI is to build proper regulations and ethical boundaries around our work.
