Ten-fold increase in child abuse online sparks police calls for tech giants to do 'much more' to block it

Referrals of online child sex abuse images have increased ten-fold, as police demand that social media giants do “far more” to block such material from their sites in the first place.

Senior officers say the explosion in online sex abuse is overwhelming investigators who are “struggling to cope” and distracting them from concentrating resources on identifying and catching “high harm” offenders who are actually sexually assaulting children.

Referrals of online child sex abuse images to police by the tech industry have risen 997 per cent, from 11,477 in 2013 to 113,948 last year, according to data released yesterday by the National Crime Agency (NCA). Each referral can include dozens of images. A decade ago, the figure was just 1,591.

The disclosure came as Apple, one of the world’s richest companies, said it had just six members of staff to police child abuse on its platforms - and spent only £1.5 million on its investigations team. YouTube said it had no humans to look for harmful videos and instead relied on AI.

Lynne Owens, NCA director general, said the technology already existed to design out preventable offending and stop images going online.

This would free up the NCA and police to target and investigate the most serious offenders, many of whom use sophisticated encryption on the dark web to avoid detection.

“What we are proposing to industry is that they have to do so much more about those sites that they have access to because that will then enable us to really focus on the very high harm offenders who are committing contact abuse with children,” said Ms Owens.

“At the same time, we have seen a change in behaviour. To access dark web sites, sometimes you are required to abuse a child. We have real life victims of an offender wanting to access a site who can only do so if they rape a child. They may be six months old, two years old, five years old,” she said.

Ms Owens disclosed earlier this week that up to 144,000 British paedophiles are sharing sexual images or abusing children on the dark web, nearly double the previous estimate of 80,000.

She used the evidence of the rise in online abuse to support the NCA’s case for an extra £2.7 billion to fight serious and organised crime, including more than doubling the agency’s budget to £650 million.

“It brings huge demands for police,” she said. “We aim to get to a place where we can identify who are the very dangerous contact offenders so that we are making the right referrals, the right investigations.”

Simon Bailey, the National Police Chiefs’ Council lead on child safety, said the failure of the tech giants to block images is “putting additional burdens on a system that is already struggling to cope.

"It's distracting us and our ability to deal with high harm, high risk offenders. Until such time as the companies prevent the uploading and sharing of images, we are never ever going to come to terms with the scale of the threat that we are now seeing and risk being posed to children on a daily basis."

Off the back of the referrals, he said, police were making 400 arrests a month, but “the bottom line is that it’s a drop in the ocean.”

Melissa Polinsky, Apple’s director of investigations and child safety, told the official inquiry into child abuse that, of her team of 20 investigators, only the team’s dedicated lawyer and five others could assess abuse material.

She also said Apple, which has more than one billion active devices worldwide, had investigated only around a hundred reports of child abuse on its platforms last year.

Barrister Eesvan Krishnan asked Ms Polinsky whether, given Apple’s size, a member of the public might “be struck by the fact that there are only six employees worldwide”.

She responded that Apple took a “holistic” approach to child safety and viewed it as “every employee’s responsibility”.

YouTube told MPs yesterday that it employed no humans to look for harmful videos on its site as it was not considered an “effective” method of detection.

Marco Pancini, YouTube’s director of public policy for Europe, the Middle East and Africa, said artificial intelligence found more than 90 per cent of the videos the company deleted.

He said: “There is no one that is actively looking for violations, not because we don’t want to have someone doing that, but because it won’t be effective.”

The executive added that the company employed humans to analyse flagged videos to identify harmful trends, although he was unable to say how many were involved in the effort.

The comments come after YouTube was heavily criticised when footage of the Christchurch mosque shootings was uploaded to the site more than 20,000 times on the day of the massacre.

Andy Burrows, NSPCC associate head of child safety online, said: “These figures really make clear the immense scale of the problem. It is vital that tech firms take responsibility for child abuse material on their sites and shift the emphasis from removing material to not allowing it to appear in the first place.

“It is crucial that the Government brings in an independent regulator as quickly as possible, so that platforms are forced to design safe accounts for young users and employ techniques to identify groomers – facing tough consequences if children are put at risk on their sites.”
