The education sector takes first steps in AI regulation

AI is growing at an exponential rate, and AI-enabled tools are now widely available in schools. For education providers, this presents both new opportunities and new risks.

In February 2024, the Department for Science, Innovation and Technology (DSIT) and the Department for Education (DfE) asked a number of regulators to publish their strategic approach to AI and the steps they were taking in line with the UK AI Regulation White Paper (see our previous blog for more information). The Office for Standards in Education, Children’s Services and Skills (Ofsted) and the Office of Qualifications and Examinations Regulation (Ofqual) both published their approaches to regulating AI on 24 April 2024.

Ofsted’s approach

Ofsted currently uses AI in its risk assessment of good schools, but recognises that AI tools can also help education providers make better-informed decisions, reduce workload and lead to innovative ways of working.

Ofsted supports the use of AI where it improves the care and education of learners, and will assess the impact of AI on children and learners as part of its inspection and regulatory processes. It will not, however, actively inspect the quality of the AI tools used by education providers.

Ofsted’s approach to regulation will follow the five principles set out in the UK AI Regulation White Paper:

  • Safety, security and robustness – Ofsted will make sure AI solutions are safe and secure for users and that they protect users’ data. It will continually test AI solutions to identify and rectify bias and error.
  • Appropriate transparency and explainability – Ofsted will be transparent about its use of AI and will test solutions sufficiently to understand the decisions they make.
  • Fairness – Ofsted will only use AI solutions that are ethically appropriate. In particular, it will fully consider any bias relating to small groups and protected characteristics at the development stage, then monitor it closely and correct it where appropriate.
  • Accountability and governance – Ofsted will provide clear guidance and rules for developers and users of AI within the organisation about their responsibilities.
  • Contestability and redress – Ofsted will make sure that staff are empowered to correct and overrule AI suggestions, and will ensure decisions are made by the user and not the technology.

Additionally, Ofsted will continue to manage concerns and complaints through its existing complaints procedure.

Given the pace of technological change, Ofsted has confirmed that it will keep up to date with relevant published research and will continue to communicate with providers so that they can better understand AI and its regulation.

Ofqual’s approach

Like Ofsted’s, Ofqual’s approach to regulating AI in the qualifications sector sets out five key objectives, which mirror the five principles of the UK AI Regulation White Paper. The objectives aim to support the design, development and delivery of high-quality assessments, while identifying the risks of using AI in non-exam assessments.

The objectives are as follows:

  • Ensuring fairness for students – to ensure AI does not lead to unfair outcomes and/or lack of clarity over what constitutes malpractice.
  • Maintaining validity of qualifications – recognising and managing potential threats to validity and identifying and acting on activities more susceptible to being adversely affected by AI.
  • Protecting security – being alert to malpractice, including protecting student data.
  • Maintaining public confidence – ensuring steps are taken to maintain public confidence around the use and effects of AI.
  • Enabling innovation – using AI within design, development and delivery of qualifications.

Ofqual aims to ensure that AI is used by awarding organisations in a manner that is safe and appropriate and does not threaten the fairness and standards of qualifications.

The future for Scotland

Both Ofsted and Ofqual are regulatory bodies in England, so they do not directly affect education providers in Scotland, but it is likely that other regulatory bodies, including those in Scotland, will be encouraged to follow suit.

The Scottish Qualifications Authority (SQA) has issued guidance on its approach to using generative AI. This guidance focuses predominantly on students’ use of AI during assessments, particularly avoiding plagiarism, but no firm policy decisions have yet been taken on regulating AI in Scotland’s education sector.

With two of England’s biggest education regulators committed to implementing the principles of the UK AI Regulation White Paper, it is hoped that regulators in other jurisdictions will follow. This would bring uniformity to the approach to AI regulation and encourage public confidence in the use of AI in education.
