
A survey conducted by education law firm Browne Jacobson has found that only 8% of more than 200 school and academy trust leaders feel adequately prepared to implement AI effectively, and only 9% said they have an AI strategy in place (a further 31% said a strategy was in development).
Furthermore, 75% are worried that they do not have adequate AI expertise in their organisation.
In particular, respondents – who included school leaders, executive headteachers, and academy trust CEOs – are concerned about issues including plagiarism (65%), training (62%), quality control (58%), data security (57%), and safeguarding (44%).
However, despite these findings, 50% of the leaders said they are already using AI tools in their work, including 20% who use them regularly (29% are not using AI at all and 21% rarely use it).
Respondents to the survey, which was run during the autumn term, said that their AI use helps to ease workload, speed up everyday tasks, and personalise learning.
The most common uses of AI include creating or enhancing resources (44%), managing and reducing workload (36%), summarising content such as emails and reports (36%), personalising learning (19%), and generating website and social media content (16%).
Safe AI? The leadership survey reveals a wide range of AI uses, raising questions about data protection safeguards (source: Browne Jacobson)
Respondents felt that other areas where AI could be of benefit included assistive technologies supporting children with SEN (34%), improving assessment and feedback (26%), and governance support and policy management (13%).
ChatGPT was the most commonly cited AI tool used by respondents, with other software including Gemini, Microsoft Copilot, Claude, and TeachMateAI.
The report raises concerns that schools and trusts could be breaching data protection rules by using tools not tailored to an education audience.
It states: “It appears that personal data is being processed in AI tools by schools, with personalised learning, assessment and feedback, virtual tutoring, handling parent enquiries and complaints all being listed as uses.
“AI is being used to address recruitment processes as well, which suggests that tools are being used in a way that may pose additional risks to individuals.
“Of the 40% of leaders who use AI, the majority said that they were using readily available tools such as ChatGPT, Gemini and Copilot, with only a minority using tools that were specifically designed for the UK education sector.
“These tools may be deployed because of their ease of access. However, terms and conditions of these freely available tools may not be suitable for education, with terms including provision for inputs to be used to further train the model, and risks of inaccuracy as they are not trained specifically on UK education information.”
In January, the Department for Education strengthened its guidance for schools on using generative AI (DfE, 2025), but a quarter of the respondents to the survey said the guidance was inadequate.
The report adds: “Education leaders are concerned about the lack of AI expertise and feel unprepared for its implementation, yet optimistic about its potential to improve education by personalising learning and reducing workload.
“Many are not using AI, and those who do often use general tools not tailored for the UK education sector, raising suitability and security concerns. The survey highlights the need for better expertise, tailored AI tools, strategies for governance, risk management and effective AI integration.”
Claire Archibald, legal director specialising in data protection at Browne Jacobson, said: “Leaders tell us they are concerned about data privacy and security, bias and fairness, safeguarding, and quality control. This reveals a contradiction – their actions, such as trying out new technologies without fully understanding the implications, don’t always reflect these concerns.”
Bethany Paliga, senior associate specialising in data protection at Browne Jacobson, added: “We’re urging schools and academy trusts to carefully consider which AI tools are used to ensure they properly consider compliance risks in order to use AI safely and effectively.
“Embracing AI in education is not just about staying ahead technologically; it’s understanding the unique complexities and challenges that come with adopting new technology in a school environment.”
- Browne Jacobson: School Leaders Survey: Sharing insights, emerging opportunities and challenges facing the nation’s schools, February 2025: www.brownejacobson.com/school-leaders-survey
- Browne Jacobson is running a live session, entitled AI and safeguarding: Identifying risks and embracing opportunities, during its EdCon 2025 virtual conference, which runs from March 3 to 28 and is free to attend. Visit www.brownejacobson.com/insights/edcon-2025
- DfE: Policy paper: Generative artificial intelligence (AI) in education, January 2025: www.gov.uk/government/publications/generative-artificial-intelligence-in-education/generative-artificial-intelligence-ai-in-education