A Melbourne company has developed AI that turns the tables on illegal hunters, identifying them faster in images from automated field cameras and allowing ground forces to swoop in.

An image captured from one of the WPS field cameras (left) and the AI-identified person (right). Credit: Silverpond
Looking at the picture above, most people would probably notice the dogs in front and give no thought to the ghostly smudge in the centre. But that’s exactly why Wildlife Protection Solutions (WPS) has turned to AI technology to take a harder look.
That smudge you didn’t recognise was an illegal poacher.
Despite many conservation efforts to crack down on illegal wildlife poaching, poachers continue to evade detection.
The US-based organisation has commissioned Melbourne AI specialists Silverpond to develop an algorithm that can quickly and accurately identify people and vehicles that might be involved with poaching in conservation reserves in Africa, Asia and the Pacific.
And the technology is working. Before the cloud-based machine-learning system, called Silverbrane, was implemented, WPS had only a 40% detection rate. It’s now 70-80%, with expectations this will rise further as the system is fine-tuned.
Fine-tuning for murky images
The key to success was developing customised technology that could work with often murky images.
WPS already had motion-detection cameras in 21 locations covering 1,853,320 hectares of protected land, but couldn’t find an AI package that could effectively assess, sort and prioritise the million or so images captured each year. As a result, there were many false alarms.
“Off-the-shelf solutions are trained using good-quality images that aren’t blurred or taken in low light or at unusual angles, which is why they struggle with images like these,” says Silverpond’s Susie Sheldrick.
“The people in WPS’s images are often shadowy and look like ghosts. There are not many situations where you would want a person detected in an image like that, but that’s exactly what they needed.”
“For AI to be really effective, the data source is vital,” says Sheldrick. “To train the model what to look for, it needs to be shown.”
So rather than training the AI on images that showed people clearly or in well-lit conditions, the Australian developers drew on WPS’s library of millions of labelled images of poachers captured in the field, mostly in poor light.
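In practice, that kind of domain-specific fine-tuning usually means starting from a detector pre-trained on everyday photos and retraining it on the harder field imagery. The sketch below shows one plausible version using a COCO-pretrained torchvision detector; the class list, hyper-parameters and data loader are illustrative assumptions, not details of the Silverbrane system.

```python
# A minimal sketch, not Silverpond's code: fine-tune a COCO-pretrained
# detector on labelled low-light camera-trap images. The class list and
# hyper-parameters are assumptions for illustration.
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 3  # background, person, vehicle (assumed label set)

def build_model(num_classes: int = NUM_CLASSES) -> torch.nn.Module:
    # Start from a detector trained on everyday photos, then swap its
    # classification head so it predicts only the classes that matter here.
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    return model

def fine_tune(model, data_loader, epochs=10, lr=5e-3, device="cuda"):
    # data_loader should yield (images, targets) pairs where each target is a
    # dict with "boxes" (N x 4 float tensor) and "labels" (N int64 tensor);
    # WPS's labelled poor-light field images would stand in here.
    model.to(device).train()
    optimiser = torch.optim.SGD(
        [p for p in model.parameters() if p.requires_grad],
        lr=lr, momentum=0.9, weight_decay=5e-4,
    )
    for _ in range(epochs):
        for images, targets in data_loader:
            images = [img.to(device) for img in images]
            targets = [{k: v.to(device) for k, v in t.items()} for t in targets]
            loss_dict = model(images, targets)  # detector returns its losses in train mode
            loss = sum(loss_dict.values())
            optimiser.zero_grad()
            loss.backward()
            optimiser.step()
    return model
```

Starting from pretrained weights means the blurry, low-light examples only have to teach the model what is different about camera-trap footage, not what a person looks like from scratch.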

Hope the Rhino following surgery to close a wound left by poachers who hacked her horn off. Credit: Moeletsi Mabe/The Times/Gallo Images/Getty Images
Humans confirm hits and notify on-the-ground forces
The resulting system can quickly dismiss pictures of animals (though these can have value in helping track animal movements), passing on for human scrutiny only those images that show something out of place, which may be a matter of time of day as much as location.
Trained WPS staff can explain away some irregularities, but in other cases they contact rangers on the ground.
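That triage step can be pictured as a simple routing rule on top of the detector’s output. The sketch below is illustrative only: the class names, confidence threshold and night-time window are assumptions, not WPS’s actual rules.

```python
# A hedged sketch of the triage described above: images with only animals are
# archived (still useful for movement tracking), while anything with a person
# or vehicle -- especially at an unusual hour -- is queued for a human.
from dataclasses import dataclass
from datetime import time

ALERT_CLASSES = {"person", "vehicle"}
CONFIDENCE_THRESHOLD = 0.5                      # assumed cut-off
NIGHT_START, NIGHT_END = time(19, 0), time(5, 0)  # assumed night window

@dataclass
class Detection:
    label: str
    confidence: float

@dataclass
class CameraEvent:
    camera_id: str
    captured_at: time
    detections: list

def is_night(t: time) -> bool:
    # The night window wraps around midnight.
    return t >= NIGHT_START or t <= NIGHT_END

def triage(event: CameraEvent) -> str:
    hits = [d for d in event.detections
            if d.label in ALERT_CLASSES and d.confidence >= CONFIDENCE_THRESHOLD]
    if not hits:
        return "archive"        # animals only: keep for movement tracking
    if is_night(event.captured_at):
        return "urgent_review"  # out of place, at an out-of-place hour
    return "review"             # pass to trained WPS staff

# Example: a shadowy, medium-confidence "ghost" at 2am still reaches a human.
event = CameraEvent("cam-07", time(2, 15), [Detection("person", 0.62)])
print(triage(event))  # -> urgent_review
```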
Two groups of poachers were detected and apprehended in the first week of its use.
What’s exciting, says Sheldrick, is the model’s ability to continually improve: the more images it is trained on, the higher its detection rate is likely to be.
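That feedback loop can also be sketched: each detection a reviewer confirms or corrects becomes a new labelled example, and the model is periodically re-fine-tuned on the grown library. The helper make_loader below is hypothetical, and fine_tune refers to the earlier sketch; neither is the production workflow.

```python
# A hedged sketch of the continual-improvement loop: fold reviewer-confirmed
# labels back into the training set and periodically re-fine-tune.
confirmed_examples = []  # (image, target) pairs verified by WPS staff

def record_confirmation(image, target):
    # A human reviewer has confirmed (or corrected) the model's output.
    confirmed_examples.append((image, target))

def retrain_if_ready(model, base_dataset, min_new_examples=500):
    # Once enough new labelled examples accumulate, fine-tune again so the
    # detection rate keeps climbing as the image library grows.
    if len(confirmed_examples) < min_new_examples:
        return model
    loader = make_loader(base_dataset + confirmed_examples)  # assumed helper
    confirmed_examples.clear()
    return fine_tune(model, loader, epochs=3)  # from the earlier sketch
```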
More broadly, she says, it’s a good way of showing people what AI can do.
“Wildlife conservation is miles away from where people think AI can be deployed. This isn’t about robots. It’s using artificial intelligence to build on what humans can do.”
Susie Sheldrick and the Silverbrane project appear in the SBS podcast The Few Who Do.