Big Tech firms should accept responsibility for allowing online child sexual abuse, says UK National Police Chiefs' Council (NPCC) lead for child protection, Chief Constable Simon Bailey.
According to the Norfolk police chief, England's law enforcement agencies have been "overwhelmed" by the number of cases, as he stressed that tech behemoths, such as Facebook and Google, are "not doing enough" to stop online child abuse, including the uploading, sharing, and viewing of indecent images.
"It is undoubtedly, in my mind, one of the greatest, if not the greatest societal threats that we are having to now deal with as a police service. At the moment, law enforcement agencies across the country are becoming overwhelmed with the sheer volume of cases that we are having to deal with, which is absolutely impacting on our ability to deal with the more sophisticated, the more tech-savvy, and potentially the more dangerous offenders, and we are having to deal with an awful lot of lower risk offenders".
"I don't think their role in all this has been truly appreciated because without them the abuse wouldn't be able to take place in so many cases. It's the big market leaders that actually bear responsibility for making sure the internet is a safe place for our children and for our grandchildren to go. And ultimately at this moment in time it's not safe", Bailey said.
The police chief revealed that online grooming has seen a surge during the COVID-induced lockdowns, with young people spending more time on the internet, and paedophiles supposedly exploring more opportunities to abuse children online.
According to Bailey, the eye-popping increase, from 7,000 indecent images of minors in 1990 to 17 million on the child abuse image database today, can be attributed to the fast-paced development of the internet and the fact that virtually anyone can now take and share pictures from an advanced mobile phone.
"The technology is there to prevent the uploading, the sharing, the viewing of images, I think the technology is there to monitor what is taking place within chatrooms, where grooming is taking place. Ultimately, the companies absolutely must bear the responsibility for allowing so much of this abuse to take place and I hold them responsible", he added.
According to Kevin Curran, Professor of Cyber Security at Ulster University, Executive Co-Director of the Legal Innovation Centre, and group leader for the Ambient Intelligence & Virtual Worlds Research Group at the Computer Science Research Institute, the problem of the distribution of child pornography is "a major worry for society".
"Yes, exploitative images exist on social networks such as Facebook but the vast majority resides on the dark web and in other specialised forums on the surface web. Google does not host these images but its role as the leading search engine of course does mean that they have a responsibility to root out these images. The internet giants have claimed that they are working to resolve this problem through a combination of human moderators and algorithms based on artificial intelligence. Microsoft have themselves created an image recognition software which is used by many police forces and internet organisations who attempt to combat child porn online", the professor says.
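Tools of the kind the professor describes typically work by comparing uploaded files against a database of hashes of known illegal images. The sketch below is a deliberately simplified illustration using exact cryptographic hashes; real deployments, including Microsoft's software, rely on perceptual hashing, which also matches images that have been cropped, resized, or re-encoded. The function name and the blocklist contents here are hypothetical.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known illegal images,
# of the sort a hash-sharing scheme would supply to platforms.
KNOWN_HASHES = {
    "a" * 64,  # placeholder digest
    "b" * 64,  # placeholder digest
}

def matches_known_image(data: bytes, known_hashes: set[str] = KNOWN_HASHES) -> bool:
    """Return True if the uploaded bytes exactly match a listed image.

    Simplified: an exact digest changes completely if the file is
    re-encoded, which is why production systems use perceptual hashes
    instead of cryptographic ones.
    """
    digest = hashlib.sha256(data).hexdigest()
    return digest in known_hashes
```

A platform would run such a check at upload time and block or report the file on a match; the hard part in practice is keeping the hash database current and handling images that have been altered to evade exact matching.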
Curran explains that the difficulty is the sheer size of the internet, "the fact that dark web sites exist which cannot actually be taken down as they are not using the 'centralised web' in addition to paywalls and the multitude of forums in different languages and different jurisdictions makes the problem of eradication very difficult".
"However, we all agree that something needs to be done and the internet giants must be transparent in how they are dealing with the problem. We need to know the size of their teams, budgets and gain some insight into their plans for a safer web", Curran adds.
Meanwhile, a Facebook spokesperson has said that child exploitation and grooming have no place on their platforms.
"Using industry-leading technology, over 99% of child exploitation content we remove from Facebook and Instagram is found and taken down before it's reported to us. We also use a combination of technology and behavioural signals to detect and prevent grooming, or potentially inappropriate interactions between a minor and an adult. We have 35,000 people working in our safety and security team to keep our platforms safe", the spokesperson added.
Last year, the UK's National Crime Agency warned there were at least 300,000 people in the country posing a sexual threat to children, while the pandemic is believed to have exacerbated the situation.