
Headteacher Manny Botwe, president of the Association of School and College Leaders (ASCL), has lamented the “darker side” of social media in his address to the union’s annual conference.
He called for social media platforms to be “brought to heel” in a bid to end their “trail of harm”.
It comes as research commissioned by ASCL and published during its annual conference in Liverpool on March 14 and 15 showed just what schools are up against.
The survey gathered responses from 6,409 teachers and school leaders in England (2,393 primary and 4,016 secondary), asking what social media-related issues they had seen in school since September. Primary school respondents reported that:
- Pupils are being bullied by their peers on social media (49%)
- Parents are making negative comments about staff or the school on social media (40%)
- Pupils are accessing pornographic or violent content (18%)
This final finding comes after a 2023 report from the Children’s Commissioner for England warned that Twitter – as it was known then – had become the most common place where young people see pornography (41%), ahead of dedicated pornography sites (37%), Instagram (33%) and Snapchat (32%). The average age of first exposure was 13, although some children were as young as nine.
ASCL’s findings revealed that in 11% of secondary schools, deepfake images or audio are being used maliciously against staff or students, while the secondary respondents also reported problems with students recording teachers or other pupils without permission (46%) and students accessing extremist content (11%). All three of these issues were much less frequent among the primary respondents (1%, 5% and 3% respectively).
Only 14% of respondents (18% of those in primary schools) said they had no significant issues with social media. Furthermore, 72% of primary respondents reported that pupils were using social media below platforms’ minimum age requirements (usually 13).
Addressing the ASCL conference on Friday (March 14), Mr Botwe – who is the headteacher of Tytherington School in Macclesfield – highlighted the impact of social media and related issues on children and schools.
He said: “Today’s young people face challenges that are vastly different from those of previous generations. Their world is shaped by smartphones, social media, memes, and influencers – forces that shape their identities, interactions, and even their mental wellbeing.”
Mr Botwe acknowledged the benefits that this technology can bring, but added: “As we all know, it has a darker side. It leaves a trail of harm – safeguarding concerns, fractured friendships, bullying, anxiety, and the spread of extremist ideologies. And increasingly, it is being weaponised against schools and teachers, with disgruntled parents using it as a platform to target staff.
“This chaos must end. For too long, tech billionaires have been given immense power without accountability. They hide behind the defence that they are champions of free speech while profiting from platforms that allow harm to fester.
“Enough is enough. It is time to bring these platforms to heel and force them to police their own spaces. As a society, we have the right to demand the protection of our children, the enforcement of decency, and the upholding of standards. That right must be asserted.”
Mr Botwe welcomed the Online Safety Act 2023, which is currently being implemented by the regulator Ofcom, but said he is waiting to see “how effective [it will] prove in practice”.
New duties under the Act regarding illegal content came into force on March 17, meaning sites and apps must now start tackling criminal content. The Act lists more than 130 “priority offences” – split into 17 categories – including content promoting child sexual abuse, controlling or coercive behaviour, terrorism, and suicide, and tech firms must “assess and mitigate the risk of these occurring on their platforms”.
There is also new age-restriction guidance for industry, and companies must prevent children from accessing content that is harmful or age-inappropriate, including pornography, serious violence, bullying, self-harm and eating disorders. Where such content is present on a platform, companies must use age verification tools to stop children from encountering it.
Ofcom published new guidance in January, although tech companies have a period of grace to comply with the various provisions (see below).
Mr Botwe’s address came after a recent survey of school leaders published by the National Association of Head Teachers, which reported an increase in both the frequency and severity of parental abuse of school staff, including 46% who said they had been abused online. The school leaders surveyed reported trolling on social media and in parent groups on platforms such as Facebook and WhatsApp.
- Ofcom: Important dates for Online Safety Act compliance: www.ofcom.org.uk/online-safety/illegal-and-harmful-content/important-dates-for-online-safety-compliance