(ORDO NEWS) — Healthy coral reefs are known for their visual splendor: the vibrant variety of colors and shapes that fills these beautiful underwater ecosystems.
But they can also be quite noisy places. If you have ever snorkeled on a coral reef, you will be familiar with the distinctive clicking and popping sounds made by marine life, such as snapping shrimp and feeding fish.
This hum of background noise – almost like the rattling hiss of radio interference – is a unique feature of the coral reef soundscape and can help us monitor the health of these endangered marine habitats.
In a new study, scientists trained a machine learning algorithm to recognize the subtle acoustic differences between a healthy, living reef and a degraded coral patch – an acoustic contrast so faint that humans cannot distinguish it.
Compared with other labor-intensive, time-consuming ways of monitoring reef health – divers visiting reefs to visually assess coral cover, or experts manually listening to reef recordings – the new tool could offer significant benefits, the team said. In addition, many reef dwellers hide or are only active at night, further complicating visual surveys.
“Our results show that a computer can pick up patterns that are indistinguishable to the human ear,” says marine biologist Ben Williams of the University of Exeter in the UK.
“It can tell us faster and more accurately about the state of the reef.”
To capture coral acoustics, Williams and his colleagues recorded at seven sites in the Spermonde Archipelago, off the southwest coast of Indonesia’s Sulawesi Island, where the Mars Coral Reef Restoration Project is underway.
The recordings covered four different types of reef habitat – healthy, degraded, mature restored, and newly restored – each differing in the amount of coral cover and thus producing a different noise pattern from the aquatic creatures living and feeding there.
“We previously relied on manual listening and annotation of these recordings to make reliable comparisons,” Williams explained on Twitter.
“However, this is a very slow process, and the size of marine soundscape databases is growing exponentially with the advent of inexpensive recording devices.”
To automate this process, the team trained a machine learning algorithm to distinguish between recordings of the different reef types. In subsequent testing, the AI tool determined the state of the reef from audio recordings with 92 percent accuracy.
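The study does not publish its code here, but the general approach – summarizing each recording as a vector of acoustic features and training a classifier to separate reef states – can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: the feature vectors below are synthetic stand-ins for descriptors that would be extracted from real hydrophone recordings, and the classifier choice is an assumption.

```python
# Hedged sketch: classify "healthy" vs "degraded" soundscapes from
# synthetic feature vectors standing in for acoustic descriptors.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend each recording is summarized by 8 acoustic features
# (e.g. band energies, snap rates). Healthy reefs get a shifted
# mean to mimic their richer, louder soundscape.
n_per_class = 200
healthy = rng.normal(loc=1.0, scale=1.0, size=(n_per_class, 8))
degraded = rng.normal(loc=0.0, scale=1.0, size=(n_per_class, 8))

X = np.vstack([healthy, degraded])
y = np.array([1] * n_per_class + [0] * n_per_class)  # 1 = healthy

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"held-out accuracy: {acc:.2f}")
```

The key point the sketch makes is that the model is evaluated on recordings it never saw during training, which is how a headline figure like "92 percent accuracy" is meaningfully measured.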
“This is a really exciting development,” says co-author and marine biologist Timothy Lamont of Lancaster University in the UK.
“In many cases, it’s easier and cheaper to install an underwater hydrophone on a reef and leave it there than to have expert divers come in and survey the reef repeatedly – especially in remote locations.”
According to the researchers, the algorithm’s results depend on a combination of factors in the underwater soundscape: the abundance and variety of fish vocalizations, sounds made by invertebrates, possibly even faint noises thought to be made by algae, and contributions from abiotic sources (e.g. subtle differences in how waves and wind sound in different types of coral habitat).
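To make the idea of soundscape factors concrete, here is a generic sketch of how a few coarse acoustic descriptors might be computed from a mono recording with NumPy. The function name, the descriptors chosen, and the snap-detection threshold are all illustrative assumptions, not the study's method; the synthetic signal stands in for a real hydrophone recording.

```python
# Illustrative only: crude soundscape descriptors (overall level,
# spectral centroid, transient count) computed on a synthetic signal.
import numpy as np

def soundscape_features(signal, sample_rate):
    """Return a few coarse acoustic descriptors of a mono recording."""
    # Overall sound level (root-mean-square amplitude).
    rms = np.sqrt(np.mean(signal ** 2))

    # Spectral centroid: the amplitude-weighted mean frequency.
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    centroid = np.sum(freqs * spectrum) / np.sum(spectrum)

    # Count sharp transients (a crude proxy for shrimp snaps):
    # samples whose amplitude exceeds 4x the RMS level.
    snaps = int(np.sum(np.abs(signal) > 4 * rms))
    return {"rms": rms, "spectral_centroid_hz": centroid, "snap_count": snaps}

# Synthetic 1-second "recording": a low hum plus three loud clicks.
sr = 16000
t = np.arange(sr) / sr
sig = 0.1 * np.sin(2 * np.pi * 300 * t)
sig[[2000, 8000, 14000]] = 1.0  # three impulsive "snaps"
feats = soundscape_features(sig, sr)
print(feats)
```

Descriptors like these, computed over many recordings, are the kind of input a classifier can learn from even when the differences are too subtle for a human listener.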
While the human ear may not be able to easily detect such faint and hidden sounds, machines appear able to pick out these differences reliably. The researchers acknowledge that the method can still be improved, and they expect that a wider sample of sounds in the future will provide “a more subtle approach to the classification of an eco-state”.
Unfortunately, time is running out for corals. We will have to act quickly if we are to save them.