The Zoological Society of London (ZSL) has revealed details of how its ongoing partnership with Google Cloud is supporting its efforts to protect endangered animal species from poachers using machine learning and artificial intelligence tools.
For the last three years, the international conservation charity has been working with the Google Cloud team to co-build custom machine learning models to help identify and track endangered species across the world.
However, the next phase of its work has seen the pair use acoustic sensors to record gunshot noises, combining the recordings with the cloud giant’s data warehousing and machine learning tools to analyse the threat poachers pose to endangered species in a set area.
According to estimates from the World Wide Fund for Nature (WWF), wildlife poaching generates about $20bn a year for participants, and is linked to the decimation of endangered species populations around the globe.
To tackle this, ZSL began work in 2018 to deploy 69 acoustic recording sensors in a nature reserve in Cameroon, central Africa, to collect and analyse audio data to gauge how useful this would be in monitoring illegal hunting activity in this area.
The experiment generated a huge volume of data, and the organisation worked with Google to find an efficient way of sorting through it to identify and analyse any gunshots captured in the recordings.
“Over the course of a month, ZSL’s acoustic devices captured 267 days’ worth of continuous audio totalling 350GB,” wrote Omer Mahmood, head of customer engineering for the consumer packaged goods and travel industry covering UK and Ireland at Google, in a blog post. “Even one month’s worth of data would be too labour intensive for a human to listen to and analyse manually.”
To address this, ZSL deployed a pre-trained, open source, machine learning model called YAMNet, which was originally designed in-house at Google and trained on millions of YouTube videos.
“YAMNet was used to recognise sound events in ZSL’s dataset, stored in Google Cloud Storage. The initial classification of 350GB worth of data took less than 15 minutes to complete and identified 1,746 instances with a high confidence of being gunshots,” said Mahmood.
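In spirit, that classification step amounts to scanning YAMNet’s per-frame class scores and keeping only the frames where the gunshot class clears a confidence threshold. A minimal sketch of that filtering logic, using mock scores and an illustrative class index and threshold rather than YAMNet’s real class map:

```python
import numpy as np

# YAMNet-style output: a (frames, classes) score matrix. The class
# index and threshold below are illustrative assumptions, not values
# from ZSL's pipeline.
GUNSHOT_CLASS = 2   # hypothetical index of the gunshot class
THRESHOLD = 0.8     # assumed "high confidence" cut-off

def flag_gunshots(scores: np.ndarray,
                  class_idx: int = GUNSHOT_CLASS,
                  threshold: float = THRESHOLD) -> list:
    """Return indices of frames whose gunshot score clears the threshold."""
    return np.flatnonzero(scores[:, class_idx] >= threshold).tolist()

# Mock score matrix: 5 frames x 4 classes
scores = np.array([
    [0.10, 0.20, 0.05, 0.65],
    [0.02, 0.03, 0.92, 0.03],   # likely gunshot
    [0.40, 0.30, 0.20, 0.10],
    [0.01, 0.01, 0.97, 0.01],   # likely gunshot
    [0.30, 0.30, 0.20, 0.20],
])

print(flag_gunshots(scores))    # -> [1, 3]
```

Filtering on the score matrix rather than re-listening to audio is what lets 350GB of recordings collapse to a shortlist in minutes.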
The data collected was organised using BigQuery, Google’s fully managed data warehousing tool, where details of the device that recorded each suspected gunshot were logged, along with its location and the time of the recording.
“This allowed ZSL to quickly query and focus on only the audio files with the highest probability of containing a gunshot sound from thousands of hours of recording, for further analysis,” said Mahmood.
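Reproduced locally, the query Mahmood describes is essentially a filter-and-sort over detection rows. A hypothetical sketch, with invented device names, coordinates and confidence values standing in for ZSL’s BigQuery table:

```python
# Each row mirrors what the article says was logged in BigQuery:
# device, location, time, plus the model's gunshot confidence.
# All values below are invented for illustration.
detections = [
    {"device": "sensor-07", "lat": 2.981, "lon": 13.121,
     "time": "2018-06-03T02:14:00", "confidence": 0.91},
    {"device": "sensor-21", "lat": 3.010, "lon": 13.200,
     "time": "2018-06-03T11:40:00", "confidence": 0.42},
    {"device": "sensor-07", "lat": 2.981, "lon": 13.121,
     "time": "2018-06-04T23:55:00", "confidence": 0.88},
]

# Equivalent in spirit to:
#   SELECT * FROM detections WHERE confidence >= 0.8
#   ORDER BY confidence DESC
high_confidence = sorted(
    (d for d in detections if d["confidence"] >= 0.8),
    key=lambda d: d["confidence"],
    reverse=True,
)

print([d["device"] for d in high_confidence])  # -> ['sensor-07', 'sensor-07']
```

Only the clips behind these surviving rows would then need human review.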
From here, any audio clips containing suspected gunshot noises needed to be manually listened to and visually inspected as spectrograms to be confirmed as gunshots.
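A spectrogram makes a gunshot comparatively easy to confirm by eye, because an impulsive bang shows up as a broadband vertical spike across frequencies. A minimal short-time Fourier transform sketch, using a synthetic clip rather than ZSL’s field recordings:

```python
import numpy as np

def spectrogram(signal, frame_len=256, hop=128):
    """Magnitude spectrogram via a short-time Fourier transform.
    Returns an array of shape (freq_bins, time_frames)."""
    window = np.hanning(frame_len)
    frames = [signal[i:i + frame_len] * window
              for i in range(0, len(signal) - frame_len + 1, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1)).T

# Synthetic clip: quiet background noise with one sharp impulse
# standing in for a gunshot.
rng = np.random.default_rng(0)
clip = rng.normal(0, 0.01, 4096)
clip[2000:2010] += 1.0              # the impulsive event

spec = spectrogram(clip)
# The frame with the most total energy is where the impulse sits,
# and that column of the spectrogram is bright at all frequencies.
loudest_frame = int(np.argmax(spec.sum(axis=0)))
print(loudest_frame)
```

In practice an inspector plots `spec` on a log scale and looks for that vertical stripe, rather than reading energies numerically.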
By cross-referencing this data and making use of BigQuery’s geospatial capabilities and machine learning modelling techniques, the ZSL team can pinpoint locations where extra law enforcement monitoring for signs of poaching activity may be needed.
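One simple way to picture that geospatial step: bin confirmed detections into coarse grid cells and count shots per cell, so the densest cells suggest where patrols should focus. A sketch with invented coordinates, standing in for an aggregation over BigQuery’s geospatial functions:

```python
from collections import Counter

# Invented coordinates of confirmed gunshots (lat, lon).
confirmed = [
    (2.981, 13.121), (2.983, 13.119), (2.982, 13.124),  # tight cluster
    (3.150, 13.300),                                     # isolated shot
]

def cell(lat, lon, size=0.01):
    """Snap a coordinate to a coarse grid cell (~1km at the equator)."""
    return (round(lat / size), round(lon / size))

# Equivalent in spirit to a GROUP BY over grid cells.
counts = Counter(cell(lat, lon) for lat, lon in confirmed)
hotspot, n = counts.most_common(1)[0]
print(hotspot, n)   # the busiest cell and its detection count
```

The cell size is a tunable assumption; real analyses would weigh sensor density and detection range when choosing it.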
“In this short one-month study that only covered a portion of the reserve, the research team were able to contribute new insights to the human threats to species,” said Mahmood. “Past data suggests gunshots are more likely to take place at night to evade ranger detection, but using ecoacoustics alone, ZSL provided evidence of illegal hunting occurring during the day.”
Looking ahead, ZSL said it plans to use the findings gleaned from this work to inform the development of real-time alerts about suspected poaching activity in the areas it is monitoring.
“With animal populations under enormous pressure, technology, and in particular machine learning, has huge potential for enabling conservation groups like ZSL to deploy their resources more efficiently in the battle against illegal wildlife trade,” concluded Mahmood.