
AI in schools: Bullies using AI to create sexually explicit images of classmates, parents warned

AI is driving 'massive issues with exclusion', concerned parents have said – while experts warn it could worsen child sexual exploitation

AI poses new threats to children. Credit: Canva

School bullies have a new weapon: artificial intelligence (AI).

Bullying, sadly, is nothing new. But the rise of AI is driving “massive issues with exclusion”, concerned parents have said – while experts warn it could worsen child sexual exploitation.

In the UK, some 59% of pupils aged seven to 17 – and 79% of internet users aged 13 to 17 – have used a generative AI tool in the last year.

This technology has many positive uses, from homework help to creativity prompts. But it has a dark underbelly.

Kate – a parent who did not want her last name used – told the Big Issue that her 10-year-old daughter came home crying after classmates fed her photo into an AI “looks rater”.

“On the site you can submit a picture of yourself or anyone else and it’ll use AI to rate every aspect of your appearance out of 10 while going into detail on your faults. It’s horrible,” she said.

“If you thought Instagram was doing damage to our kids’ self-image and self-esteem, you should see the effect this has. My daughter was crying because some of the boys in her class put her on the app and shared her score. It’s just awful… the number of issues with bullying has sky-rocketed.”

Sadly, Kate’s story is the tip of the iceberg. As AI tools become easier to use, they create new avenues not only for bullying, but for sexual exploitation.

In November, the UK Safer Internet Centre (UKSIC) revealed that it had begun receiving reports from schools that children are using AI image generators to create indecent images of each other.

Such images – which are legally classified as child sexual abuse material – could create an “unprecedented safeguarding issue,” the centre warned. A child may generate an image to taunt a classmate, but it could then spread out of their hands and end up on dedicated abuse sites.

“Young people are not always aware of the seriousness of what they are doing, yet these types of harmful behaviours should be anticipated when new technologies, like AI generators, become more accessible to the public,” said David Wright, director at UKSIC.

“Although the case numbers are currently small, we are in the foothills and need to see steps being taken now, before schools become overwhelmed and the problem grows.”

Photoshopping fake images is nothing new. But advances in generative AI mean the images and videos are “more realistic than ever and easier to use”, said Dr Andrew Rogoyski, from the Surrey Institute for People-Centred Artificial Intelligence. “So, misuse of [generative] AI to generate deepfakes is likely to increase in the near-term.”

AI companies have guardrails to prevent misuse, Dr Rogoyski added – but their systems “aren’t perfect”.

“A lot of effort is being expended to improve these safeguards, partly because companies will be held to account and partly because there are reputational risks for companies that allow their systems to be misused,” he explained.

What can we do about bullying and AI abuse in schools?

As AI companies scramble to improve internal safeguards, advocates are calling for schools and government to take action.

The UKSIC urged schools to update monitoring systems to block illegal material on school devices. More broadly, it wants to see the government implement “more regulatory oversight of AI models”.

“We must see measures put in place to prevent the abuse of this technology,” said Emma Hardy, director at the Safer Internet Centre. “Right now, unchecked, unregulated AI is making children less safe.”

The Anti-Bullying Alliance echoed this, urging the government to compel companies to consider children before rolling out new technology.

“AI and deep fakes present new challenges, but also opportunities for proactive solutions,” said Martha Evans, director of the Anti-Bullying Alliance.

“We urge the government, through Ofcom, to embed a robust children’s safety code of practice in the wake of the Online Safety Act, forcing companies to consider children’s safety in developing new technologies.”

Unfortunately, the problem goes beyond the AI companies themselves. The pictures may be generated using AI, but they are spread via internet technologies that have existed for decades.

“There is a continuing argument about whether social media companies are responsible for what appears on their platforms, and should be treated as a publisher, or whether they are just a platform. Undoubtedly tech companies could and should do more to prevent sharing of abusive images,” said Rogoyski.

The nature of the internet makes offences like these hard to detect and prosecute. And broader still is the problem of attitudes in schools, Rogoyski added.

“Is the problem the digital tools or social attitudes and behaviours? Why are young people in schools using such tools for such purposes?” he asked.

“Is it simply ease of use or is there a more concerning erosion of knowing right from wrong?”

