Staff will have different levels of confidence and experience with different types of technology and need time to learn how to use the available artificial intelligence tools.
NEU members have told us that teachers are using GenAI tools to:
- Quickly produce content such as presentations which they then review and adjust if needed.
- Develop resources for lessons such as quizzes.
- Aid in the development of lesson plans.
- Aid in summarising or differentiating texts for different resources.
We advise that if you create learning resources using GenAI tools, you always carefully check and consider the output, as GenAI can produce factually inaccurate content and will tend to reproduce biases in its results.
Think of what AI tools produce as a 'first draft' and use your professional judgement, subject knowledge, and expertise to review and critique the content to create a final resource.
Where GenAI is used to summarise or produce different versions of texts, we advise that you check that what it has composed accords with your understanding of the meaning of the original text.
Data protection and privacy
AI systems require the ongoing collection of user data (in education this would primarily be pupil data, but could also be the data of staff, parents or other stakeholders). The amount of data processed, and the centrality of that data to developing GenAI, mean there are heightened risks in relation to data privacy and the potential exploitation of data for commercial purposes.
Campaigning organisations such as Defend Digital Me have highlighted significant gaps in the governance of data processing in education contexts, with a lack of accountability to staff, learners and parents across the system as a whole. There is therefore a need to improve these systems, and the legal framework for protecting pupil data, as AI is adopted more widely.
Bias and prejudice
GenAI systems will replicate biases which exist in their training data. Most GenAI products have been trained on the internet, and their output will likely reproduce “opinions that are most common or dominant”. There are also risks that use of GenAI will make “writing more homogeneous, as its suggestions by definition reproduce standard ways of expressing ideas and normative values”.
There are potentially very serious implications if AI systems are involved in decisions that will have an impact on students, including assessing student outcomes or progress and recommending interventions or support.
AI and personalisation
The promise of more personalised and tailored learning has been touted as one of the potential benefits of AI and edtech in general, but the NEU believes this is not necessarily the outcome. Academics such as Wayne Holmes and others have highlighted that the vision of learning that relies on AI is often quite limited:
"One of the significant drawbacks of AI-enabled personalised learning … is the potential erosion of social interactions in education, critical for fostering trust, motivation, and engagement. Meanwhile, by overly emphasising individual learning paths, it can actually undermine students’ self-actualisation, leading to homogenised learning outcomes. It can also downplay the crucial role of education in community-building and social skills development, ignore the holistic development of students, and potentially perpetuate socio-economic and cultural disparities."
Available GenAI products
It is important to consider the possible costs of using products, including set-up and ongoing fees, how they are licensed and whether this makes them a viable option for use. Many GenAI products that are currently free or cheaply available may increase in cost, and, as the DfE itself acknowledges, “privately-operated AI tools are able to change terms of use at any time and lack accountability and transparency."
There are a range of different AI tools available, and AI is increasingly being integrated into existing platforms, such as Google Classroom and Microsoft packages.
Due to the constantly changing nature of the technology, we advise informed use and collective discussions in your school/setting.
Schools and educators should be cautious about selecting AI-based products marketed as new or unique: many of the AI products marketed to schools are in fact underpinned by the same foundational GenAI models. No AI products have been independently evaluated in relation to education; there is no robust research on their use in schools.
Student use of AI
Using AI applications will become part of many students’ daily reality, and staff will need to engage with the issues they raise and help prepare students to navigate learning in this context. To do this, staff themselves will need support, CPD and training to develop their understanding of AI, how it can be used in the classroom, and effective strategies and approaches to educating about it, as well as managing its use by students.
Use of GenAI by students has provoked widespread concern about cheating in written work, and the dangers of relying on GenAI which can reproduce false or factually inaccurate information.
The NEU does not believe 'banning' students from using AI for their work is likely to be desirable or effective. We would suggest schools adopt a policy which guides students to use AI appropriately and ethically - including by critically analysing the content it produces - and to properly acknowledge and reference their use of AI.
Students should be encouraged to think critically about the technology, how it works, who produces and designs it, and how it uses and processes data.
This will form an important part of developing students’ digital literacy in the context of AI, as well as their ability to assess and engage with different sources of information. ChatGPT and other forms of GenAI could be an opportunity to discuss disinformation and bias, and for students to learn how to critically assess and use different sources of information to inform their thinking.
AI and plagiarism
Use of GenAI by students, in particular the inclusion of AI-generated content within written work, is a growing trend.
Unfortunately, automated systems for detecting AI use in students' written work have been shown to be ineffective and unreliable. Some systems have even been shown to disadvantage students whose first language is not English, falsely flagging their work as AI-generated.
This means staff will continue to need to rely on their professional knowledge and expertise to judge factual accuracy and whether a student has engaged in plagiarism.