AI action plans should be slowed until safeguards for children in place – NSPCC

The children’s charity has called for statutory safeguards around generative AI to help protect youngsters.

The Government should slow down its artificial intelligence action plans until a statutory duty of care for children is in place around the technology, a leading charity has said.

The NSPCC said generative AI is already being used to create sexual abuse images of children, and urged the Government to consider adopting specific safeguards into legislation to regulate AI.

The charity added that it had also found more than three-quarters of the public (78%) would prefer more robust safety checks on new generative AI tools, even if that meant the launch of such products was delayed.

A new study commissioned by the NSPCC also found that 89% of those asked had some level of concern about AI and its potential impact on children’s safety.

The charity said it had been receiving reports from children about AI through Childline since 2019.

Earlier this month, the Prime Minister announced plans to boost the AI industry in the UK, and to increase its use in daily life, starting with the civil service, as part of wider Government plans to help grow the economy.

The US has also announced major investment plans in the technology. The current AI boom was sparked by the introduction of generative AI chatbot ChatGPT in late 2022, and the field has since grown into a key innovation area of the tech sector.

Chris Sherwood, the NSPCC’s chief executive, said: “Generative AI is a double-edged sword.

“On the one hand it provides opportunities for innovation, creativity and productivity that young people can benefit from; on the other it is having a devastating and corrosive impact on their lives.

“We can’t continue with the status quo where tech platforms ‘move fast and break things’ instead of prioritising children’s safety.

“For too long, unregulated social media platforms have exposed children to appalling harms that could have been prevented.

“Now the Government must learn from these mistakes, move quickly to put safeguards in place and regulate generative AI, before it spirals out of control and damages more young lives.

“The NSPCC and the majority of the public want tech companies to do the right thing for children and make sure the development of AI doesn’t race ahead of child safety.

“We have the blueprints needed to ensure this technology has children’s wellbeing at its heart, now both Government and tech companies must take the urgent action needed to make generative AI safe for children and young people.”

The AI Action Summit, an international conference, is due to take place in Paris next month.

Derek Ray-Hill, interim chief executive at the Internet Watch Foundation, which seeks out and helps remove child sexual abuse imagery from the internet, said existing laws, as well as future AI legislation, must be made robust enough to ensure children are protected from being exploited by the technology.

“Artificial intelligence is one of the biggest threats facing children online in a generation, and the public is rightly concerned about its impact,” he said.

“While the technology has huge capacity for good, at the moment it is just too easy for criminals to use AI to generate sexually explicit content of children – potentially in limitless numbers, even incorporating imagery of real children. The potential for harm is unimaginable.

“AI companies must prioritise the protection of children and the prevention of AI abuse imagery above any thought of profit. It is vital that models are assessed before they go to market, and rigorous risk mitigation strategies must be in place, with protections built into closed-source models from the outset.

“The upcoming AI Bill is a key opportunity to introduce safeguards for models to prevent the generation of AI-generated child sexual abuse material, and child sexual abuse laws must be updated in line with emerging harms, to prevent AI technology being exploited to create child sexual abuse material.”