
24 February 2023 · 2 min read

This blog is an excerpt from episode four of our Everyday AI podcast.

When most people think about birds, they don’t think about artificial intelligence (AI).

Dr Jessie Barry, Program Manager at the Cornell Lab of Ornithology, is not most people.

Jessie leads a project called Merlin Bird ID. If you spot a bird you don’t recognise, you can upload a photo of it to the app. An in-built AI system then helps you figure out what this bird might be based on your current location.

But it’s not only bird photos the team at Cornell University have been collecting over the years. The project is now also home to the world’s oldest and largest collection of animal and bird sounds.

“We have more than a million recordings. That's enough to start training this computer how to identify bird sounds," Jessie said.

“In the last few years, phones have become so good that it’s become easier for more people to participate. Merlin can then help provide answers for what those birds are.”

Not only is AI helping everyday people figure out which bird has been waking them up at 5am each morning, it’s also helping marine biologists with a thorny problem.

Jessie Barry: So we're listening to a recording from the Capertee Valley in eastern Australia, New South Wales, and here it's 6:21 in the morning and we're listening to the birds waking up. There's a Rufous Whistler. Splendid Fairywren. White-plumed Honeyeater and Grey Butcherbird sounding off…

Jon Whittle: Jessie Barry is an expert bird watcher. In fact, she was once part of a team that broke the world record for the most bird sightings in a day… 294 different species.

Jessie Barry: It definitely feels like a marathon of birding… we do have to make sure that there's a stash of coffee and water and Gatorade and all those good things to keep us going. But, you know, really the energy that we get from seeing that many birds carries us through.

Jon Whittle: Jessie is also an ornithologist at Cornell University. Along with a team of experts, she helps identify birds around the world, based on their physical features… or their sounds. Some of them are much loved backyard birds… you might recognise this one.

Jessie Barry: This magpie is absolutely incredible.

Jessie Barry: This Pied Currawong, I just think is super cool. I mean, listen to that…

Jon Whittle: Others are a little trickier for the amateur birder to identify.

Jessie Barry: So here we're listening to a Black Honeyeater, and this is a bird whose range extends across inland Australia, but they're pretty hard to find. They're an irruptive species that moves around in unpredictable ways, and this field crew of undergraduates was determined to find one. They were crawling along on the ground, parabola outstretched in front of them, and they got this epic recording while carefully approaching the bird. And, you know, this is something that you could spend a lot of time looking for. And they were just thrilled to capture this cut.

Jon Whittle: I must admit, I can tell the difference between a magpie and a kookaburra… but when it comes to all those little flitty things, I get a bit stuck.

But just imagine if nature lovers and newly converted pandemic-era twitchers could join the record-breaking ranks of birders like Jessie.

Now, with a little help from AI… we can.

Jessie Barry: So Merlin is a free app to help people identify birds, and you don't really need to have any background in birding or bird watching in general to use Merlin.

Jon Whittle: Jessie leads a project called Merlin Bird ID. If you spot a bird that you don’t know, you can upload a photo of it to the app. An in-built AI system then helps you figure out what this bird might be based on your current location.

Jon Whittle: I’m Jon Whittle, an AI expert at CSIRO, Australia’s national science agency. And this is Everyday AI, a podcast series that unpacks just how much artificial intelligence is part of our daily lives.

In this episode, we’re diving into the world of AI for ecosystem rehabilitation and biodiversity management. And we’ll hear about some exciting projects that are achieving success by combining the work of experts with everyday people and artificial intelligence.

We’ll come back to Jessie Barry and the Merlin Bird ID app a little later. We’ll also be going to the Great Barrier Reef, where computer vision technology is helping scientific divers identify a pesky starfish species. But first, we’re going snorkelling.

Mark McGrouther: My favourite fish... That's a really difficult question! I love some of the deep sea fish, you know, the big toothy things with photophores, which are light organs. And I love some of the damselfish as well. I also love sharks. I think sharks are amazing, graceful creatures. (fades under Jon)

Jon Whittle: This is Mark McGrouther. He’s an ichthyologist, a fish scientist. He really likes fish.

Mark McGrouther: And I also love some of the seahorses and the pipefish and the sea dragons. They’re incredible.

Jon Whittle: Mark is a senior fellow at the Australian Museum, where he worked for 37 years as the fish collection manager.

Mark McGrouther: Through my career as an ichthyologist with the Australian Museum, I went on numerous field trips all over the Pacific and Australia. Dived on the sides of volcanoes and all kinds of amazing places. And what we'd do is we'd go out and we'd spend a couple of days or maybe a couple of months collecting fishes, which would then come back and end up in the museum's research collection, and they would then potentially be used by scientists who come to visit, or via the loans programme we'd send fish off to people all over the world.

Jon Whittle: Today Mark leads the Australasian Fishes project, an online collection of photos, taken by anglers, scuba divers, snorkellers and beachcombers, of fish they see underwater or that have washed up on the beach.

Mark McGrouther: The difference with the Australasian Fishes project is that generally speaking the fish aren't collected, they're just photographed. It's a bit like Facebook for fish. At the moment, we've got over 186,000 observations that have been uploaded by members of the public. We've recorded information on over 3000 species of fishes from Australia and New Zealand.

Jon Whittle: The project sits on a global platform called iNaturalist, which describes itself as ‘an online social network of people sharing biodiversity information to help each other learn about nature’. Thousands of volunteers from all over the world use it to identify different species and for research.

Mark McGrouther: iNaturalist can be used for any form of life. There are projects on, you know, bees and trees and ants and you name it.

Jon Whittle: As users upload their photos of birds, bugs, trees, fish, or whatever their organism of choice, teams of experts from around the world, like Mark McGrouther, help to identify them.

Mark McGrouther: There's a mullet expert that lives in Iran of all places. There are scientists all over the world who work on different sorts of fishes, and we consult them to get expert identifications.

Jon Whittle: It takes an average of 18 days for a photo to be identified by volunteer experts. But all the images that have been uploaded to the platform are being used to train an AI computer vision model. This system learns the key features of organisms so that it can identify common species in a matter of seconds…

Mark McGrouther: So the iNaturalist AI algorithm, it's very powerful. In fact, the more images go on there, the better it gets. But as a user, you don't need to know any of that. And in fact, I don't know much about how it works myself. It's all magic to me. There's a comment box where you can put in whatever you want to say about this fish record. And there's also an identification box, and you can click in the identification box, and that's when the AI takes over. You just wait a couple of seconds and it will pop up a list of possible fish that your fish could be. A lot of it's based on how it looks, but they also take geographic locality into account. So they might say: we're not confident, but based on colouration and based on where it's found, we think this is your fish.
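For the technically curious, here's a minimal sketch of the kind of weighting Mark is describing: a vision model's similarity scores multiplied by how often each candidate species is recorded near the photo's location. Every species name, number and function below is an illustrative assumption, not iNaturalist's actual model or API.

```python
# Hypothetical classifier output: species -> visual similarity score (0-1).
vision_scores = {
    "Eastern Blue Groper": 0.62,
    "Maori Wrasse": 0.58,
    "Crimson-banded Wrasse": 0.31,
}

# Hypothetical geographic prior: how often each species is recorded near
# the photo's geolocation, derived from past observations on the platform.
location_prior = {
    "Eastern Blue Groper": 0.40,
    "Maori Wrasse": 0.05,  # rarely recorded at this locality
    "Crimson-banded Wrasse": 0.30,
}

def suggest(vision_scores, location_prior, top_n=3):
    """Rank candidates by visual score weighted by local sighting frequency."""
    combined = {
        species: score * location_prior.get(species, 0.01)
        for species, score in vision_scores.items()
    }
    total = sum(combined.values())
    ranked = sorted(combined.items(), key=lambda kv: kv[1], reverse=True)
    # Normalise over the candidate set so the scores read as percentages.
    return [(species, value / total) for species, value in ranked[:top_n]]

for species, confidence in suggest(vision_scores, location_prior):
    print(f"{species}: {confidence:.0%}")
```

This is why the same photo can get different suggestions in different places: the geographic prior can demote a strong visual match that is rarely seen at that locality.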

Jon Whittle: The AI fish identification tool can really lighten the burden on the volunteer experts. But in order for it to properly learn from all these photos, there are a few rules that users need to follow.

Mark McGrouther: One of which is that the fish has to have been observed in Australia or New Zealand. It has to have a date and time, and a location, so it's geolocated. And also a photograph, so that we can actually validate the record via the photograph. You can then make comments on what you think the fish is, or anything about it: unusual colours, reproductive behaviours, all kinds of things.

Jon Whittle: All this info helps the AI algorithm home in on the possible species your fish could be. Because there are a couple of issues when it comes to getting a computer to identify fish from images alone.

Mark McGrouther: You can take a photograph of a fish side-on and that's how the AI will recognise it, because, you know, it's straightforward: it's this length, it's this shape, it's got this pattern on it, its mouth is like this, its gills look like this, whatever. But of course fish don't keep still. That's the problem. And you need to have a huge library of images of fish in different directions, with fins sticking out and what have you, to identify it.
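One standard way around that problem, offered here as a general technique rather than anything we know about iNaturalist's internals, is data augmentation: synthesising extra training views of each photo with random flips, rotations, crops and colour shifts, so the model learns to cope with poses and conditions it hasn't seen. A short sketch using the torchvision library, with a hypothetical file path:

```python
from PIL import Image
from torchvision import transforms

# Random transformations that mimic the variation in real fish photos.
augment = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),                # fish facing either way
    transforms.RandomRotation(degrees=25),                 # tilted swimming angles
    transforms.RandomResizedCrop(224, scale=(0.7, 1.0)),   # partial views, fins out of frame
    transforms.ColorJitter(brightness=0.3, contrast=0.3),  # murky or sunlit water
])

photo = Image.open("fish.jpg")  # hypothetical example photo
training_views = [augment(photo) for _ in range(8)]  # eight synthetic poses per photo
```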

Jon Whittle: Birds on the other hand, are a little easier to spot. Here’s Jessie Barry again.

Jessie Barry: Relative to other species like insects or mammals, where you just wish you could identify more or you wish you could see them more often, you know, birds are accessible. And so they're really kind of the perfect, you know, taxonomic group to be able to understand the heartbeat of the planet, like, you know, those seasonal differences, those changes in, you know, species distributions through time. All of that is visible through birds.

Jon Whittle: As with the fishes project, the AI scans your bird photo and compares it to millions of other images of birds it’s been trained to recognise, to tell you the most likely match.

Jessie Barry: If you open up a field guide, it's going to have, you know, probably all the birds of Australia or another region and you're trying to figure out what's even here. And so Merlin uses data to say which species of birds are likely at a given date and location. And then when you put in some parameters like the size, the colour and what the bird's doing, Merlin can narrow down that list of possibilities.
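As a toy illustration of that narrowing-down: start with the shortlist of species likely at the user's date and location, then filter by their answers. The species table and fields below are invented for the example and are not Merlin's actual data or code.

```python
# Hypothetical shortlist for a given date and location, as might be
# derived from past sightings in the area.
likely_species = [
    {"name": "Australian Magpie", "size": "large", "colours": {"black", "white"}},
    {"name": "Superb Fairywren",  "size": "small", "colours": {"blue", "black"}},
    {"name": "Pied Currawong",    "size": "large", "colours": {"black", "white"}},
]

def narrow_down(candidates, size=None, colour=None):
    """Filter the date/location shortlist by what the user observed."""
    results = candidates
    if size:
        results = [s for s in results if s["size"] == size]
    if colour:
        results = [s for s in results if colour in s["colours"]]
    return [s["name"] for s in results]

print(narrow_down(likely_species, size="small", colour="blue"))
# -> ['Superb Fairywren']
```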

Jon Whittle: This data that Merlin has been trained with comes from a huge library of photos taken by everyday people, of birds in their backyard or that they’ve seen on holiday or wherever… And they’ve been collected over decades through an online project called eBird.

Jessie Barry: eBird started in 2002, and the community has now contributed over 1.3 billion bird sightings from all over the world, which makes it the largest citizen science project.

Jon Whittle: But it’s not only bird photos that the team at Cornell University have been collecting over the years… it’s now also home to the world’s oldest and largest collection of animal and bird sounds.

Jessie Barry: We have more than a million recordings. That's enough to start to be able to train Merlin, and train this computer how to identify bird sounds.

Jon Whittle: Jessie’s team have been training another AI in the app to learn how to identify different bird songs. Users can upload a recording of wherever they are, and the AI will identify all the birds it can hear in that recording. A Shazam for birds!

So far this incredible AI can identify bird sounds across the US, Canada and Europe, and Jessie tells me they’re currently preparing to release the tech for some species near the tropics. Hopefully we’ll soon be able to use it here in Australia, too.
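The usual recipe behind a 'Shazam for birds', and plausibly something along these lines is at work inside Merlin, is to convert the audio into spectrograms and classify short windows of them with a trained model. A rough sketch with a dummy stand-in for that classifier; training it on the million labelled recordings is the hard part:

```python
import numpy as np
from scipy.signal import spectrogram

def identify_birds(audio, sample_rate, classify_window):
    """Slide over a recording in three-second chunks and classify each one.

    `classify_window` stands in for a trained model that maps one
    spectrogram to a species label.
    """
    chunk = 3 * sample_rate
    species_heard = set()
    for start in range(0, len(audio) - chunk + 1, chunk):
        # A spectrogram turns the chunk into a time-frequency "image".
        _freqs, _times, spec = spectrogram(audio[start:start + chunk], fs=sample_rate)
        species_heard.add(classify_window(np.log1p(spec)))
    return species_heard

# Demo with synthetic noise as stand-in audio and a dummy classifier.
rate = 22050
audio = np.random.randn(rate * 9)  # nine seconds of "recording"
print(identify_birds(audio, rate, classify_window=lambda spec: "unknown"))
```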

Jessie Barry: In the last few years, phones have become so good that it doesn’t require special audio equipment anymore. So it’s becoming easier for more people to participate in bird song recording. We've been, you know, absolutely thrilled that the microphones in an iPhone or Android device are really good enough at recording bird songs in a way that Merlin can then help, you know, provide those answers for what the birds are.

Jon Whittle: Just like with Mark’s Australasian Fishes project, there are some human experts involved behind the scenes, helping the AI to learn from all the data.

Jessie Barry: So for example, you might have a 30-second cut of an American Robin, but in the background there's an American Goldfinch and a Yellow Warbler and a Chestnut-sided Warbler. And so an expert would need to come in and label all of those birds that are in that cut.

So when you're out in the field with Merlin and it's running, it's kind of collected all the knowledge from being able to have access to a million recordings and all this expert information on exactly what's happening. And from there, that's the kind of magic of machine learning: out comes a model that's able to make those predictions for you as a user in the field.

Jon Whittle: Once the AI is trained, it certainly takes a load off for the human identifiers on Jessie’s team… but will there come a time when machine learning AIs surpass these expert bird whisperers and listeners?

Jessie Barry: Merlin is detecting everything within the range of that microphone. So an experienced birder is going to hear a little more, because, you know, the range of their hearing goes beyond what our phones can pick up. But as far as, you know, identifying that bird correctly, Merlin is starting to catch up to the experts.

Jon Whittle: Jessie tells me that this AI sound identification technology is starting to be applied to other species and ecosystems too.

Jessie Barry: It's particularly active in marine ecosystems where you can, you know, detect and identify whales that are moving through perhaps areas where there's a lot of shipping activity. And through, you know, automated recognition of whale vocalisations and real-time detection, there's actually the opportunity there to prevent ship strikes, a leading cause of whale mortality in some regions. We can also be monitoring fishes and amphibians and a whole suite of species, and it's really about finding enough data to train the computer what to expect. And then there's the actual processing of all those hundreds of hours of recordings, which is still a major barrier, because it's just hard to manage all those terabytes of data and get them in the right place. But I'm really hopeful that in the years to come, we're going to see this technology be applied to many conservation questions across the globe.


Jon Whittle: This brings us nicely back under the ocean - and to one of my favourite AI projects at CSIRO.

Megha Malpani: I have to say, I think that just the reef structure and diversity and just sheer number of megafauna on the Great Barrier Reef is hard to find anywhere else… Like you cannot step into the water without seeing, you know, many blacktip sharks and eagle rays and guitarfish and turtles…

Jon Whittle: This is Megha Malpani. She oversees machine learning projects at Google.

Megha Malpani: I double majored in computer science and marine biology, and straight out of school I joined Google. And so this project was actually really exciting for me because it allowed me to combine my two passions.

Jon Whittle: The project Megha is referring to took place on the Great Barrier Reef, the longest continuous stretch of reef on the planet. There are a number of issues putting this incredible ecosystem at risk, and one of them is the crown-of-thorns starfish.

This is not a cute, cuddly starfish. It’s covered in poisonous spines and it eats coral. And in recent years, its populations have been growing out of control.

Traditionally, monitoring crown-of-thorns populations has been a drag. Literally. Divers are towed behind a boat and count all the crown-of-thorns starfish they see. They can only be towed at a certain speed, stopping every two minutes to record what they’ve seen. They’ll do this until they’ve surveyed the whole reef, which can take all day, or longer.

Megha Malpani: This is problematic for a lot of reasons. One being there's only so much ground a human can cover. Also, their focus is often much more narrow than a broader camera's. And I think the most important one is that, even with experts, their ability to actually spot the crown-of-thorns is low, because they camouflage themselves and they hide.

Jon Whittle: CSIRO teamed up with Megha’s team at Google to create an object-detection AI that can map out the starfish in real time. It was a close collaboration using CSIRO’s computer vision, and Google’s machine learning technology.

Megha Malpani: Basically we worked with the CSIRO to build a machine learning model that can detect crown-of-thorns starfish. And so we are deploying it on the edge, which basically means it's operating in real time on the boat and processing video in real time to figure out where there are crown-of-thorns starfish. And the idea here is that we can use it to map out the crown-of-thorns starfish and then help control teams figure out where the outbreaks are and prioritise them.
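To make 'deploying it on the edge' concrete, here's a minimal sketch of that kind of real-time loop: read each frame from the survey video as it arrives and run a detector on it. The detector below is an empty placeholder and the file name is hypothetical; the real CSIRO and Google model returns bounding boxes and confidences for each starfish it finds.

```python
import cv2  # OpenCV, for reading video frames

def run_detector(frame):
    """Placeholder for the trained object-detection model.

    A real detector would return bounding boxes and confidence scores
    for every crown-of-thorns starfish visible in the frame.
    """
    return []  # no detections in this dummy version

capture = cv2.VideoCapture("reef_survey.mp4")  # hypothetical survey footage
detections_by_frame = []
while True:
    ok, frame = capture.read()
    if not ok:  # end of the video stream
        break
    detections_by_frame.append(run_detector(frame))
capture.release()
print(f"Processed {len(detections_by_frame)} frames")
```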

Jon Whittle: When compared to divers being towed behind the boat, the AI technology had a massive impact.

Megha Malpani: We did this sort of A/B test where we had the machine learning model, and we had an expert kind of go out and try to detect crown-of-thorns starfish. And the expert found one, while the machine learning model picked up 20. And so that just gives you a sense of how difficult this task is, where even the people that are professionals and trained in detecting crown-of-thorns starfish have a hard time doing so to 100% accuracy.

Jon Whittle: The really exciting part for me is that we ensured this technology can scale beyond the Great Barrier Reef. The model has been made open source, so that students, researchers and data scientists can build on it for reef and environmental conservation projects across the world.

I had a look on the iNaturalist platform for crown-of-thorns starfish. There are over 700 sightings of these spiky critters, observed and uploaded by citizen scientists around the world.

It’s citizen science projects like this that help ecologists, researchers and ichthyologists like Mark McGrouther understand the bigger picture of the environments they work in.

Mark McGrouther: The way I look at it is you've got thousands of eyes around Australia and New Zealand that can potentially upload information about fishes. There aren't enough scientists in the professional ichthyologist community to get out there and do that all the time. That's the advantage of having people on the ground or in the water, in this case, taking photographs of fishes. We get so much better coverage.

There's information we're collecting on fish parasites, fungal diseases, injuries to fish, fish predators. Including one amazing observation I saw where a spider was eating a fish, which is just extraordinary. We've got information on mass fish kills, including slender sunfish on the Western Australian coast. They wash up in huge numbers. We're collecting data on a bunch of new species, which is pretty exciting.

Jon Whittle: Not only is the AI helping beach-lovers identify their fish faster, all that data being amassed by the Australasian Fishes project can also be applied directly to scientific research.

Mark McGrouther: We found over 140 range extensions… There are a whole bunch of fish that we've recorded way south of their normal recognised distribution and we do attribute that to warming waters because of climate change.

There's something called a whitepatch damsel, which is now known to occur seven or eight hundred kilometres further down the Queensland coast than we previously knew. We've recently been doing a little work on Sydney Harbour and there are numerous records. In fact, if I had to put a number on it, I think there are about 18 records of species that, through our project, we've identified as occurring in Sydney Harbour that we didn't know about previously, which is pretty amazing. Among those are a Clown Toby, a Cowtail Stingray and a White-spotted Dragonet, all of which were previously unknown from the harbour.

Mark McGrouther: Why does all this matter? It's hard to say. I mean, you could argue, quite justifiably, that we know enough already to know that we're stuffing the place up and that we don't need to know any more. We, in fact, need to make better environmental decisions. But of course, knowing what you've got better enables you to know how to manage it. And if you've got a whole bunch of different species that you don't know what they are, you don't know anything about them, you don't know where they reproduce, where their young live, where their nursery grounds are. You could be damaging the resources that are there through ignorance. So it's great to have thousands of people providing information to the scientists that they can then use in their research.

Jon Whittle: The team behind the Merlin Bird ID app has given bird researchers some pretty interesting insights too.

Jessie Barry: Let's take a species like the Hudsonian Godwit. This is a bird that moves from its wintering grounds in southern Chile all the way up to the Canadian Arctic and Alaska to breed. And so this is a, you know, really long migration, and that godwit has to time when it arrives in the Arctic exactly when the snow is melting. And they also want to have their chicks hatching at the same time that there's lots of insects, so they can grow in just a matter of weeks and be ready to take that incredible migration all the way down to Chile. And so when we can study, you know, when birds are arriving on their breeding grounds, are their chicks hatching at the right time? All these things need to be in sync, and birds are kind of a way to see if that's out of sync. You'd have a case where the chicks aren't going to be able to find enough food, and then that nesting success for that year becomes really low. And over time that could really, you know, lead to declines in that population. So, birds are really these indicators into things that are much less visible in the environment.

Jessie Barry: I have a deep love for birds and a passion for sharing them with others, because I think it's this way to just reconnect all of us to the natural world. You know, there's so much pressure on birds and wildlife right now with threats for their habitat. And, you know, we're in a situation where we're really sharing space with biodiversity. And I think birds are a way that just helps us all, you know, acknowledge that we've got to figure out how to live side by side. And it's going to take us really being, you know, stewards of this environment to create a system where we can all be part of it.

Jon Whittle: Hey, I gotta admit it - I'm probably the least likely person to get excited by birdwatching. But my partner's teenage daughter is really into it. She loves nothing more than going to the park and seeing how many birds she can spot. If an AI birdwatcher can inspire that teenage girl to get interested in tech too, well, that's a win in my book.


Jon Whittle: The Merlin Bird ID app is available for free on your app store. If you’d like to contribute bird photos and recordings to Cornell University’s bird library, you can upload them at ebird.org. They’ve also got sound recording workshops and online resources for becoming a field recording expert.

And at inaturalist.org you can search for citizen science projects that you can contribute to for plant, animal or insect species. This is where you can find the Australasian Fishes project if you’re a snorkel enthusiast.

In our next episode, we’re taking AI to the tennis court… and the hockey field, and the crowds in the stands to learn how artificial intelligence is revolutionising sport.

Stuart Morgan: 8 or 10 cameras on each court, essentially communicating with one another and tracking the ball as well as the player 50 times a second. And then what happens is, by virtue of the way that the cameras are able to communicate with one another you’re able to provide a prediction or estimate whether the ball is in or out on the tennis court.

Jon Whittle: I’m Jon Whittle, and Everyday AI is a CSIRO series created by me and Eliza Keck. Alexandra Persley is our supervising producer. And Jess Hamilton is senior producer, from Audiocraft. The Audiocraft production team is Jasmine Mee Lee, Cassandra Steeth and Laura Brierley Newton. We’d love to know what you think, so please subscribe to Everyday AI and leave us a review wherever you get your podcasts.


Using AI in conservation

Crown-of-thorns starfish are not cute, little starfish. They are covered in poisonous spines and eat coral. And on the Great Barrier Reef, their populations have been growing out of control.

Traditionally, marine biologists monitor population numbers by getting dragged through the water behind a boat and counting all the spiny species as they go. Divers can only be towed at a certain speed, stopping every two minutes to record what they’ve seen. They’ll do this until they’ve surveyed the whole reef, which can take all day, or longer.

We knew AI could help improve this process. So, we teamed up with Megha Malpani and her team at Google to build a machine learning model that could detect the plethora of poisonous problems.


“It's operating in real time on the boat and processing video in real time to figure out where there are crown-of-thorns starfish," Megha explained.

"We can use it to map out the crown-of-thorns starfish and then help control teams figure out where the outbreaks are and prioritise them.”

The new system has been extremely effective.

“In a test, the human expert found one crown-of-thorns starfish, while our machine learning model picked up 20!" Megha said.

We made sure this technology can scale beyond the Great Barrier Reef. The model has been made open source so that students, researchers and data scientists can build on it for reef and environmental conservation projects across the world.
