More than 100,000 web pages of children being abused were removed from the internet last year – a third more than in 2017.
The Internet Watch Foundation (IWF), which says its vision is to “eliminate child sexual abuse imagery online”, looks into reports made by members of the public who have stumbled across images they think might be illegal.
It also proactively searches for offending sites.
Of the material taken down last year as a result of the IWF’s work, 1,300 pages showed abuse of infants or babies and more than 40,000 depicted abuse or sexual torture of children under 10.
Some pages contained thousands of pictures and videos, amounting to millions of images in total, with virtually all hosted outside the UK.
The harrowing work of the IWF is carried out by analysts who view and assess every page that might contain abuse.
Their reports are passed to the police and internet companies, which then act to close down the sites.
Due to the nature of their work, the analysts are understandably guarded about revealing their identities.
“Isobel”, not her real name, has worked for the IWF for six years. She said: “If you ask any analyst, they know it’s a hard job.
“We know these are real victims, it’s a real person that has been affected.
“Something that we do, is that we never listen to the audio on a video unless we have to – it makes it more real.
“Although we want to keep a distance we still realise that behind every image is a child.”
New technology has allowed the analysts to work faster, increasing the number of pages removed.
The “hashing” system creates a unique code, or digital “fingerprint”, for an individual image, which means copies can easily be traced and removed.
An intelligent web “crawler” that systematically browses the internet makes the scanning of offending sites much faster.
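The article does not describe the IWF's actual system beyond "a unique code or digital fingerprint" per image, but the general idea can be sketched in a few lines. This illustrative example uses a cryptographic hash (SHA-256), which only matches exact byte-for-byte copies; real deployments typically also use perceptual hashes so that resized or re-encoded copies still match. All names here are hypothetical.

```python
# Illustrative sketch only: a "hash list" of known images lets any exact
# copy be flagged instantly, without storing or re-viewing the image itself.
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest uniquely identifying this exact file."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical hash list built from previously assessed material.
known_hashes = {fingerprint(b"example-known-image")}

def is_known(image_bytes: bytes) -> bool:
    """Check whether a file is an exact copy of a known image."""
    return fingerprint(image_bytes) in known_hashes
```

In practice a crawler would compute the fingerprint of each image it encounters and compare it against the hash list, so analysts only need to review genuinely new material.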
“We’re finding more,” says “Paul”, another analyst. He added: “I believe if we had more analysts we’d find more, it’s everywhere, in all places, not just hidden away in dark corners, we’re becoming far better at finding it.”
Despite the rise in the number of images taken down, it is almost impossible to know whether more of this content is being created.
To try to cut off the flow of material at source, the IWF says it will step up its work with internet companies, using its expertise on prevention rather than just reacting after material has been posted.
But campaigners argue that the tech giants should already be doing more.
Donald Findlater, director of Stop It Now, said: “People viewing sexual images of children online or indeed of grooming of children online, happens through the various platforms that they offer.
“That being the case they must then have the responsibility to proactively do something about that.
“They may not want to be dead visible about that, but the scale of the problem is such that they need to be a bigger part of the solution than they’ve been in the past.”