Indonesia - Artificial intelligence trained to study coral reef "song"

An artificial intelligence has been programmed by Exeter University researchers to monitor the health of coral reefs by studying their sounds.

  • New research shows that artificial intelligence (AI) can track the health of coral reefs by analysing the sounds emitted by the creatures that live in them.
  • Exeter University used the technology to monitor the progress of coral restoration.
  • In tests on new recordings, the system correctly determined the health status of a reef 92 per cent of the time.

What sound does a coral reef make? The University of Exeter, in the UK, decided to find out for scientific purposes. A team of researchers created an artificial intelligence programme that can determine the health of coral reefs by listening to the sounds that they make. These living structures, made up of colonies of polyps and found on the seabed, produce complex sounds and noises due to the passage of fish and other animals. Analysing these distinctive songs with AI gives researchers data for measuring the health of corals and for launching restoration projects when necessary.

Corals’ favourite songs

In the study led by Ben Williams, an algorithm was trained on a large database of sounds from both healthy and degraded coral reefs, allowing the machine to learn the difference. The artificial intelligence was then used to analyse new recordings, and it successfully determined the health of coral reefs 92 per cent of the time. “Coral reefs are facing multiple threats including climate change, so monitoring their health and the success of conservation projects is vital,” said Williams.

Until now, one of the greatest difficulties has been that visual and acoustic surveys of coral reefs relied on highly labour-intensive methods that only specially trained scientists could carry out. “Visual surveys are also limited by the fact that many reef creatures conceal themselves, or are active at night, while the complexity of reef sounds has made it difficult to identify reef health using individual recordings,” Williams explained. The Exeter team instead turned to machine learning to build a programme that could recognise a healthy coral reef’s song. “Our findings show that a computer can pick up patterns that are undetectable to the human ear. It can tell us faster, and more accurately, how the reef is doing.”
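The approach described above — extracting acoustic features from reef recordings and training a classifier on examples labelled healthy or degraded — can be sketched roughly as follows. Everything here is an illustrative assumption, not the study’s actual pipeline: the features (loudness and spectral centroid), the model (a random forest), and the synthetic “recordings” are all stand-ins chosen only to show the shape of the method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
RATE = 16000  # assumed sample rate, in Hz


def ecoacoustic_features(recording, rate=RATE):
    """Two simple illustrative features: RMS loudness and spectral centroid."""
    spectrum = np.abs(np.fft.rfft(recording))
    freqs = np.fft.rfftfreq(len(recording), d=1.0 / rate)
    rms = np.sqrt(np.mean(recording ** 2))
    centroid = np.sum(freqs * spectrum) / np.sum(spectrum)
    return np.array([rms, centroid])


def synth_recording(healthy, n=RATE):
    """Synthetic stand-in for a one-second reef recording: 'healthy' reefs
    are modelled as louder and richer in high frequencies."""
    noise = rng.normal(scale=1.0 if healthy else 0.3, size=n)
    tone_hz = 2000 if healthy else 400
    tone = np.sin(2 * np.pi * tone_hz * np.arange(n) / RATE)
    return noise + tone


# Build a labelled dataset: 1 = healthy reef, 0 = degraded reef
labels = [1] * 50 + [0] * 50
X = np.array([ecoacoustic_features(synth_recording(bool(y))) for y in labels])
y = np.array(labels)

# Train on part of the data, then score held-out recordings
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

On real reef audio the features would be richer (the paper’s title mentions ecoacoustic indices), but the principle is the same: the classifier, not a human listener, learns which acoustic patterns separate healthy reefs from degraded ones.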


Read also

Original article: Enhancing automated analysis of marine soundscapes using ecoacoustic indices and machine learning, ScienceDirect


The role of coral reefs and how to avoid degradation

The recordings used in the study were made in Indonesia, where some of the local reefs are severely damaged. The study’s authors state that the AI method provides great opportunities to improve the monitoring of these important organisms. “This is a really exciting development. Sound recorders and AI could be used around the world to monitor the health of reefs, and discover whether attempts to protect and restore them are working,” said Dr Tim Lamont, co-author of the study. “In many cases, it’s easier and cheaper to deploy an underwater hydrophone on a reef and leave it there than to have expert divers visiting the reef repeatedly to survey it – especially in remote locations.”