‘Nothing stopping’ spread of child abuse images on WhatsApp, says safety group

The Internet Watch Foundation (IWF) has accused tech giant Meta of failing to have the mechanisms in place to stop the spread of such material.

There is “nothing stopping” child sexual abuse imagery spreading on WhatsApp, a child safety organisation has said as it called on Meta to do more to protect children on its encrypted messaging platform.

The Internet Watch Foundation (IWF) has accused the tech giant of failing to have the mechanisms in place to stop the spread of such material, including the exact content that was sent to disgraced former BBC broadcaster Huw Edwards.

In July, Edwards admitted having indecent images of children, which were shared with him on WhatsApp, the Meta-owned end-to-end encrypted messaging platform. No one outside a conversation can see or access encrypted messages, including the service provider itself.

Some campaigners are pushing for changes to encryption that would give law enforcement agencies the ability to access encrypted messages as part of efforts to fight the spread of child abuse material.

However, others have argued that secure messaging platforms are vital to protect vulnerable people, including younger users, and that there is currently no viable technology that could create a so-called backdoor into encryption without breaking encryption more generally and compromising user privacy and safety.

In the wake of the Edwards case, Dan Sexton, chief technology officer at the IWF, an organisation which helps detect and remove child abuse content from the web, accused Meta of “choosing not to” ensure such indecent imagery cannot spread.

“I’d like to ask this question: how is Meta going to prevent this from happening again? What is stopping those images being shared again on that service today, tomorrow, and the next day?” he said.

“Right now, there is nothing stopping those exact images and videos of those children being shared on that platform, even though we know about it, and they know about it, and the police know about it. The mechanisms are not there. That’s what I’d like to see changed.

“There are tried, trusted and effective methods to detect images and videos of child sexual abuse and prevent them from being shared in the first place.

“But in WhatsApp, these safeguards are effectively switched off, with no alternative measures in place.

“We must not forget children are at the heart of this scandal, and everyone, including big internet companies and platforms, owe it to those victims to make sure their imagery cannot spread even further. At the moment, Meta is choosing not to.”

The IWF has been backed by other child safety groups, as well as the National Crime Agency (NCA) and safeguarding minister Jess Phillips.

She said: “Child sexual abuse is a vile crime that inflicts long-lasting trauma on victims.

“UK law is crystal clear – the creation, possession and distribution of child sexual abuse images is illegal and we continue to invest in law enforcement agencies to support their efforts in identifying offenders and safeguarding children.

“Technology exists to detect and prevent the abhorrent abuse of thousands of children and ensure victims are given privacy by stopping the repeated sharing and viewing of images and videos of their abuse.

“Social media companies must act and implement robust detection measures to prevent their platforms being safe spaces for criminals.”

Rick Jones, acting director of intelligence at the NCA, said it was “fundamentally not acceptable” for tech companies to “consciously step away from preventing the distribution of indecent images of children” on their platforms.

“Technology is available to identify these images, but most companies are choosing to design their platforms in a way that does not allow it to be used either at all, or to its full effectiveness,” he said.

“When end-to-end encryption (E2EE) is used, technology companies cannot protect their customers, millions of whom are children, as they simply cannot see illegal behaviour on their own systems.

“This is not a UK-only issue. In a statement issued in April this year, the NCA and 32 police chiefs from across Europe called on technology companies, such as Meta, to do more to ensure public safety measures are in place across their platforms.

“It is not morally defensible for platforms to put the onus on victims, especially children, to identify and report abuse that they are being subjected to.”

In response, a WhatsApp spokesperson said: “End-to-end encryption is one of the most important technologies to keep everyone safe online, including young people.

“We know people, including journalists, activists and politicians, don’t want us reading their private messages, so we have developed robust safety measures to prevent, detect and combat abuse while maintaining online security.

“This includes the ability to report directly to WhatsApp so we can ban any user who shares this heinous material and report them to the National Center for Missing & Exploited Children (NCMEC). Other messaging apps don’t have the safety measures we have developed.”