What is AI?
Artificial Intelligence (AI) is not one specific technology but a collection of methods and techniques used to build AI systems.
AI systems are often used to automate processes and to replace or complement human decision-making.
A broad definition of AI adopted by UNESCO is: “systems with the ability to process data in a way which resembles intelligent behaviour”.
This definition was adopted because, as UNESCO point out, “the rapid pace of technological change would quickly render any fixed, narrow definition outdated”.
Staff and learners are already regularly using AI systems, often without realising. Search engines, smart assistants, chatbots, translation tools, navigation apps, online games and many other applications use forms of AI.
What is Generative AI (GenAI)?
Much of the recent interest and debate about AI and its impact on education centres on this field of AI technology.
GenAI is an AI technology which generates content in response to written prompts. It can produce new content in various formats: texts, images, videos, music and software code. Tools such as ChatGPT, DALL-E, Claude and Google’s Gemini are all forms of GenAI.
The new content produced by these tools is not based on real understanding: GenAI “generates its content by statistically analysing the distributions of words, pixels or other elements in the data that it has ingested and identifying and repeating common patterns”. It can be said that GenAI works in a similar way to predictive text, but in a much more advanced form.
This means that, while GenAI produces new content, as UNESCO point out, it “cannot generate new ideas or solutions to real-world challenges, as it does not understand real-world objects or social relations that underpin language. Moreover, despite its fluent and impressive output, GenAI cannot be trusted to be accurate.”
It is important to remember that GenAI has an inbuilt tendency to invent factually inaccurate or untrue information in response to user prompts. These inventions or misrepresentations are referred to as “hallucinations”. They often appear plausible because GenAI is designed to produce fluent, coherent content.
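As a rough illustration of the “predictive text” comparison above, the short Python sketch below builds a toy model that counts which word tends to follow which in a small sample of text, then generates new sentences by repeatedly picking a likely next word. The sample text and function names are invented for illustration; real GenAI systems work on a vastly larger scale and in far more sophisticated ways, but the underlying idea of predicting likely continuations from patterns in ingested data is similar.

```python
import random
from collections import defaultdict, Counter

# A tiny sample of "training" text (invented purely for illustration).
sample_text = (
    "the cat sat on the mat and the dog sat on the rug "
    "the cat chased the dog and the dog chased the cat"
)

# Count how often each word follows each other word.
follow_counts = defaultdict(Counter)
words = sample_text.split()
for current_word, next_word in zip(words, words[1:]):
    follow_counts[current_word][next_word] += 1

def generate(start_word, length=8):
    """Generate text by repeatedly choosing a likely next word."""
    output = [start_word]
    for _ in range(length):
        candidates = follow_counts.get(output[-1])
        if not candidates:
            break  # no known continuation for this word
        next_words = list(candidates.keys())
        weights = list(candidates.values())
        # Pick the next word in proportion to how often it followed the last one.
        output.append(random.choices(next_words, weights=weights)[0])
    return " ".join(output)

print(generate("the"))
# Possible output: "the dog sat on the mat and the cat"
# Fluent-looking, but produced purely by repeating statistical patterns,
# with no understanding of cats, dogs or mats.
```

The output can look plausible and grammatical even though nothing in the model “knows” what the words mean, which is why fluency is no guarantee of accuracy.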
GenAI has been the subject of particular interest in relation to education and schools due to its ability to produce content which can be used by education institutions, staff and pupils.
What does the NEU think about AI in education?
We want:
AI technologies should be harnessed to realise benefits for education by giving staff more free time and greater professional control, and by enhancing teaching and learning.
We do not want:
The technology should not be used to increase staff-pupil ratios, drive cost savings or profits, enable employers to intensify or increase the volume of work in other areas, or impose pedagogical or curriculum approaches on staff and students.
Technology, including AI, has the potential to help improve teaching and learning, but it can also have detrimental impacts.
Much depends on the priorities and ideology that guide its development, adoption and implementation. This includes the commercial goals and motivations of the companies producing and selling the technology.
It is often claimed that AI and other forms of edtech will reduce workload. However, this promise has not always materialised: teachers’ and leaders’ work is sometimes displaced and shifted towards more routine tasks and the administration of the technology itself. The overall result can be to devalue teacher expertise, erode agency and autonomy, and reduce the quality of education.
Currently there is too much pressure on schools from a narrow and oppressive accountability system, underfunding and excessive workload. These constraints, along with a school system that is increasingly top-down, mean that staff development and voice are often limited when it comes to engaging with technology. At the same time, these pressures make educators more vulnerable to tech tools marketed and presented as time savers.
With the rush to develop and market AI to schools, there is a risk that it may become even harder for staff to navigate the use of technology and ensure it is adopted in ethical ways and in the best interests of students and staff.
Creating space for greater educator agency and negotiation in relation to technology is therefore more vital than ever.
What can AI be used for?
Staff will have different levels of confidence and experience with different types of technology, and they need time to learn how to use AI tools.
NEU members have told us that teachers are using GenAI tools to:
- Quickly produce content such as presentations which they then review and adjust if needed.
- Develop resources for lessons such as quizzes.
- Aid in the development of lesson plans.
- Aid in summarising or differentiating texts for different resources.
We advise that if you create learning resources using GenAI tools, you always check and consider the output carefully, as GenAI will produce factually inaccurate content and will tend to reproduce biases in its results.
Think of what AI tools produce as a 'first draft' and use your professional judgement, subject knowledge, and expertise to review and critique the content to create a final resource.
Where GenAI is used to summarise or produce different versions of texts, we advise that you check that what it has composed accords with your understanding of the meaning of the original text.
What about data protection and privacy?
AI systems require the ongoing collection of user data (in education this is primarily pupil data, but could also include data about staff, parents or other stakeholders). The amount of data processed, and how central that data is to developing GenAI, mean there are heightened risks around data privacy and the potential exploitation of data for commercial purposes.
Campaigning organisations such as Defend Digital Me have highlighted that there are significant gaps in the governance of data processing in education contexts, with a lack of accountability to staff, learners and parents across the system as a whole. There is therefore a need to improve these systems and the legal framework for protecting pupil data as AI is adopted more widely.
What about biases and prejudices?
GenAI systems will replicate biases which exist in their training data. Most GenAI products have been trained on the internet, and their output will likely reproduce “opinions that are most common or dominant”. There are also risks that use of GenAI will make “writing more homogeneous, as its suggestions by definition reproduce standard ways of expressing ideas and normative values”.
There are potentially very serious implications if AI systems are involved in decisions that will have an impact on students, including assessing student outcomes or progress and recommending interventions or support.
Will AI help with personalisation?
The promise of more personalised and tailored learning has been touted as one of the potential benefits of AI and edtech in general, but the NEU thinks this is not necessarily the outcome. Academics such as Wayne Holmes have highlighted that the vision of learning that relies on AI is often quite limited:
"One of the significant drawbacks of AI-enabled personalised learning … is the potential erosion of social interactions in education, critical for fostering trust, motivation, and engagement. Meanwhile, by overly emphasising individual learning paths, it can actually undermine students’ self-actualisation, leading to homogenised learning outcomes. It can also downplay the crucial role of education in community-building and social skills development, ignore the holistic development of students, and potentially perpetuate socio-economic and cultural disparities."
What GenAI products are available?
It is important to consider the possible costs of using products, including set-up and ongoing fees, how they are licensed and whether this makes them a viable option for use. Many GenAI products that are currently free or cheaply available may increase in cost, and, as the DfE itself acknowledges, “privately-operated AI tools are able to change terms of use at any time and lack accountability and transparency”.
There are a range of different AI tools available, and AI is increasingly being integrated into existing platforms, such as Google Classroom and Microsoft packages.
Due to the constantly changing nature of the technology, we advise informed use and collective discussions in your school/setting.
Schools and educators should be cautious about selecting AI-based products marketed as new or unique: many of the AI products marketed to schools are in fact underpinned by the same foundational GenAI models. No AI products have been independently evaluated in relation to education; there is no robust research on their use in schools.
What about student use of AI?
Using AI applications will become part of many students’ daily reality, and staff will need to engage with the issues this raises and help prepare students to navigate learning in this context. To do this, staff themselves will need support, CPD and training to develop their understanding of AI, how it can be used in the classroom, and effective strategies and approaches for teaching about it, as well as for managing its use by students.
Use of GenAI by students has provoked widespread concern about cheating in written work, and about the dangers of relying on GenAI, which can reproduce false or factually inaccurate information.
The NEU does not believe 'banning' students from using AI for their work is likely to be desirable or effective. We would suggest schools adopt a policy which guides students to use AI appropriately and ethically - including by critically analysing the content it produces - and to properly acknowledge and reference their use of AI.
Students should be encouraged to think critically about the technology, how it works, who produces and designs it, and how it uses and processes data.
This will form an important part of developing students’ digital literacy in the context of AI, including how to assess and engage with different sources of information. ChatGPT and other forms of GenAI could provide an opportunity to discuss “disinformation” and biases, and to learn how to critically assess and use different sources of information to inform their thinking.
What about AI and plagiarism?
The widespread use of GenAI by students, in particular use of AI-generated content within written work, is a growing trend.
Unfortunately, automated systems for detecting AI use in students’ written work have been shown to be ineffective and unreliable. Some systems have even been shown to disadvantage students whose first language is not English, wrongly flagging their work as potentially containing plagiarism.
This means staff will continue to need to rely on their professional knowledge and expertise to judge factual accuracy and whether the student is engaging in plagiarism.