Generative artificial intelligence (AI) and the associated large language models (LLMs) used by services such as ChatGPT and Google Bard may prove to be hugely beneficial tools in the classroom, but schools will need to review and strengthen their cyber security postures and teachers will need to work harder to safeguard students, the Department for Education (DfE) has said.
The DfE has today published a statement on the use of generative AI in schools and universities – informed by Westminster’s AI regulation whitepaper, also published today – setting out its position on the rapidly evolving, high-profile technology. It said it recognised that generative AI posed “opportunities and challenges” for the sector.
Speaking at the Bett show in London, education secretary Gillian Keegan said: “AI will have the power to transform a teacher’s day-to-day work. We’ve seen people using it to write lesson plans, and some interesting experiments around marking.
“Can it do those things now, to the standard we need? No. Should the time it saves ever come at the cost of the quality produced by a skilled teacher? Absolutely not,” she said.
“But could we get to a point where the tasks that really drain teachers’ time are significantly reduced? I think we will.
“Getting to that point is a journey we in this room are going to have to go on together – and just as we’ve responded to other innovations like the calculator, we’ll use it to deliver better outcomes for students,” said Keegan.
In the full statement, the DfE said that used appropriately, generative AI had the potential to become a useful administrative assistant for teachers, reducing the time they spend on non-pupil-facing activities and allowing them to focus more on their core education mission.
However, it said, teachers would need to be aware of the data and privacy implications of using it for such purposes – for example, taking care not to give it access to personal or sensitive data on the students in their care, and understanding that any data they do enter should be treated as having been released onto the internet.
Teachers will also need to develop enhanced cyber security awareness, said the DfE. For example, the use of LLMs to generate increasingly convincing phishing emails could put schools at greater risk of falling victim to serious cyber incidents up to and including ransomware attacks.
The DfE also warned that such tools could produce unreliable information, and would therefore need to be checked to ensure appropriateness and accuracy – doubly important given that, in many cases, a given tool will not have been trained on the English curriculum.
It said that if generative AI is used to produce administrative plans, policies or documents, the person who produces them must remain professionally responsible for their content.
Turning to homework and other forms of unsupervised study, the DfE is recommending schools begin to review their existing policies in this area to account for the widespread public availability of generative AI.
This will include taking steps to prevent malpractice by students who may be tempted to use generative AI to produce formally assessed coursework that is not their own.
In circumstances where it may be deemed appropriate for students to use generative AI, they will also need to be equipped with the knowledge and skills to judge whether the material it produces is accurate and appropriate. Teachers will have a duty to ensure that students are not using AI tools to access or generate harmful or inappropriate content.
Schools can help in this regard by enhancing their IT teaching to cover the safe and appropriate use of AI – helping students understand the limitations, reliability and biases of AI tools; how information on the internet is organised and ranked; and online safety practices that guard against harmful or misleading content. The DfE committed to a programme of work to support schools in this.
Support package
The DfE is also today announcing a support package to help schools build “safe, secure and reliable” foundations to enable them to use more powerful technology.
These additions to existing digital and technology standards – covering the use of cloud services, servers and storage, and filtering and monitoring – are intended to help schools save money and create secure learning environments.
They are accompanied by a new digital service to help sector leaders in their technology planning. This will benchmark their technology against digital standards, suggest areas for improvement, and offer actionable steps and self-service implementation guidance.
Schools in Blackpool and Portsmouth – both Priority Education Investment Areas – will pilot these services in the autumn term of 2023, prior to a roll-out across the country.