News

Boris Johnson told: 'Stand up to social media' to protect our children

The government is being urged to “stand up” to the social media giants and bring forward promised legislation to better protect young people online.

It has been 18 months since the government published its Online Harms White Paper and pledged to make the UK “the safest place in the world to be online”.

England’s children’s commissioner Anne Longfield is frustrated at the delay and has repeated her concerns that technology companies cannot be relied upon to regulate themselves.

Furthermore, in a new report this week, Ms Longfield warns that the introduction of end-to-end encryption messaging by technology giants could make it much harder for platforms to detect grooming, scan for child abuse material and share reports with law enforcement agencies.

It states: “The five most popular messaging services used by children – WhatsApp, Snapchat chats, Instagram DM, Facebook Messenger and Apple iMessage – are already fully end-to-end encrypted by default, have made public their plans to become so in the near future, or have indicated that they are considering the possibility.”

The report, entitled Access Denied (Vibert, 2020), looks at children’s use of private messaging services like WhatsApp and Facebook Messenger.

It reveals that nine in 10 children aged eight to 17 are now using a messaging app or website, including seven in 10 children aged eight to 10.

What is more, 60 per cent of eight-year-olds and 90 per cent of 12-year-olds report using a messaging app with an age restriction of 13 or older.

More than one-third of children say that they have received “something that made them feel uncomfortable on a messaging service”. The report also finds that:

  • Almost one in 10 children report using a messaging site to talk to strangers.
  • One in 20 children say they have shared videos or photos of themselves with strangers.
  • One in six girls aged 14 to 17 report having received something distressing from a stranger via a private message.
  • Over a third of eight to 10-year-olds and more than half of 11 to 13-year-olds admit that they have said they were older than they were in order to sign up to an online messaging service.

The report says that the privacy of direct messaging platforms can conceal some of the most serious crimes against children, including grooming, exploitation and the sharing of child sexual abuse material.

It cites a 2019 NSPCC investigation which found that Facebook, Instagram and WhatsApp were used in child abuse image and online child sexual offences an average of 11 times a day that year.

The NSPCC has also found that the rate of grooming offences committed in the UK appears to have further accelerated during Covid, with 1,220 offences recorded in the first three months of national lockdown. Facebook-owned apps (Facebook, Instagram, WhatsApp) accounted for 51 per cent of these reports and Snapchat a further 20 per cent (2020).

The Access Denied report calls on the government to step up its timetable for enacting the Online Harms White Paper. Published in April last year, the White Paper sets out the government’s proposal to create a new regulatory framework for online platforms on which users interact and share material.

Central to this would be a new statutory duty of care, with compliance enforced by an independent regulator. Any online service found to be in breach of its duty of care would face sanctions. A range of online harms would be in scope of the regulation, including both legal and illegal harms.

Prime minister Boris Johnson confirmed his commitment to the proposals in his 2019 manifesto, when he said he would “make the UK the safest place in the world to be online – protecting children from online abuse and harms”.

However, the Access Denied report states: “Efforts to improve children’s experiences online have been undermined by setbacks and delays. It is now over 18 months since the publication of the White Paper, and over three years since the publication of the Internet Safety Strategy Green Paper which preceded it.

“In this time the government has released only its initial response to the White Paper consultation, with a full response promised by the end of this year. It has only committed to bringing forward legislation by the end of this Parliament, which could be as late as 2024.”

Added to this, Ms Longfield is concerned that end-to-end encrypted messaging services could be classed as “private communications” and might therefore not be subject to the duty of care in the same way as other platforms.

The report adds: “End-to-end encryption makes it impossible for the platform itself to read the contents of messages, and risks preventing police and prosecutors from gathering the evidence they need to prosecute perpetrators of child sexual exploitation and abuse.”

The report asks the government to commit to bringing the Online Harms legislation before Parliament in 2021. It says that the new regulatory regime must:

  • Set a strong expectation on platforms to age verify their users (Ms Longfield is frustrated that while most platforms have age limits, children routinely ignore them and the companies “do little to meaningfully enforce” them).
  • Allow strong sanctions for companies which breach the duty of care.
  • Include the full range of services used by children, including end-to-end encrypted messaging services.

Ms Longfield said: “This report reveals the extent to which online messaging is a part of the daily lives of the vast majority of children from the age of eight. It shows how vigilant parents need to be, but also how the tech giants are failing to regulate themselves and so are failing to keep children safe.

“The widespread use of end-to-end encryption could put more children at risk of grooming and exploitation and hamper the efforts of those who want to keep children safe.

“It’s time for the government to show it hasn’t lost its nerve and that it is prepared to stand up to the powerful internet giants, who are such a big part in our children’s lives. Ministers can show they mean business by promising to introduce legislation in 2021 and getting on with the job of protecting children from online harms.”

The report’s author, Simone Vibert, senior policy analyst for the Children’s Commissioner, added: “Hundreds of thousands of children are using messaging apps to contact strangers, including sharing images and photos, and they are receiving messages back which make them feel uncomfortable.

“The fact that there are age limits on these apps shows that the tech giants themselves are aware of the risks, and yet most do very little, if anything, to reliably check the age of their users. Our research shows a majority of children are using a messaging app which they aren’t old enough to be using. It is yet more evidence of the need for a statutory duty of care on online platforms, including messaging apps.”

The report also includes links to the children’s commissioner’s online toolkits for parents and children, designed to support internet use during Covid-19. Its Digital 5 a Day tool also helps adults to begin a conversation with children about how to achieve a healthy and balanced digital diet.