Monitoring coral reef health through audio recordings

New research indicates that AI technology can help with monitoring coral reef health through sound recordings.

With coral reefs continuing to decline under the threat of climate breakdown, monitoring their health is crucial. Reefs have a complex soundscape, and assessing their condition from sound recordings has traditionally required experts to conduct painstaking manual analyses.

The importance of monitoring coral reef health

However, in a new study published in the journal Ecological Indicators, titled ‘Enhancing automated analysis of marine soundscapes using machine learning to combine ecoacoustic indices’, University of Exeter scientists have developed a method that makes this analysis simpler and more accurate.

The scientists trained a computer algorithm on multiple recordings of healthy and degraded reefs, allowing the machine to learn the difference between them. The computer then analysed a host of new recordings and correctly identified reef health 92% of the time. The team utilised this AI technology to track the progress of reef restoration projects.
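As a rough illustration of the general approach described above – and not the study’s actual pipeline, data or features – the sketch below trains an off-the-shelf classifier to separate recordings labelled ‘healthy’ from ‘degraded’ using a few simple acoustic summary features. The file names, labels and feature choices are hypothetical placeholders.

```python
# Illustrative sketch only: a generic classifier separating "healthy"
# from "degraded" reef recordings using simple acoustic features.
# File names, labels and features are hypothetical, not from the study.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def extract_features(path):
    """Summarise one recording as a small feature vector."""
    y, sr = librosa.load(path, sr=None, mono=True)
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)
    bandwidth = librosa.feature.spectral_bandwidth(y=y, sr=sr)
    rms = librosa.feature.rms(y=y)
    zcr = librosa.feature.zero_crossing_rate(y)
    return np.array([centroid.mean(), bandwidth.mean(), rms.mean(), zcr.mean()])

# Hypothetical labelled recordings: 1 = healthy reef, 0 = degraded reef.
recordings = [("healthy_reef_01.wav", 1), ("healthy_reef_02.wav", 1),
              ("degraded_reef_01.wav", 0), ("degraded_reef_02.wav", 0)]

X = np.stack([extract_features(path) for path, _ in recordings])
y = np.array([label for _, label in recordings])

# Hold out some recordings to check whether the model generalises.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, stratify=y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

print("Held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

In practice, a study of this kind would use many more recordings and richer ecoacoustic indices, and would validate the model with cross-validation rather than a single train/test split.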

“Coral reefs are facing multiple threats, including climate change, so monitoring their health and the success of conservation projects is vital,” commented lead author Ben Williams. “One major difficulty is that visual and acoustic surveys of reefs usually rely on labour-intensive methods.

“Visual surveys are also limited by the fact that many reef creatures conceal themselves, or are active at night, while the complexity of reef sounds has made it difficult to identify reef health using individual recordings. Our approach to that problem was to use machine learning – to see whether a computer could learn the song of the reef.”

The scientists’ findings indicated that this AI technology can detect coral reef patterns that are undetectable to the human ear, and can provide faster, more accurate information for monitoring coral reef health.

Monitoring sound beneath the ocean

The fish and other creatures living on coral reefs make a vast range of sounds. The meaning of many of these calls remains unknown, but the new AI method can distinguish between the overall sounds of healthy and unhealthy reefs.

The audio recordings utilised in the study were taken at the Mars Coral Reef Restoration Project, which is restoring heavily damaged reefs in Indonesia.

Co-author Dr Tim Lamont, from Lancaster University, said the AI method creates major opportunities to improve coral reef monitoring.

The benefits of utilising this AI technology

“This is a really exciting development. Sound recorders and AI could be used around the world to monitor the health of reefs and discover whether attempts to protect and restore them are working,” Dr Lamont said.

“In many cases it’s easier and cheaper to deploy an underwater hydrophone on a reef and leave it there than to have expert divers visiting the reef repeatedly to survey it – especially in remote locations.”

This study was funded by the Natural Environment Research Council and the Swiss National Science Foundation.
