Artificial intelligence in education

Artificial intelligence (AI) in education is a complex and controversial issue. For each potential use, there are also substantial risks and limitations.

Education staff must be involved in discussions about the safe and effective use of AI to support and enhance teaching.

Artificial Intelligence (AI) is not one specific form of technology but a collection of methods and techniques used to build AI systems, which are often deployed to automate processes and to replace or complement human decision-making.

A broad definition of AI adopted by UNESCO is: “systems with the ability to process data in a way which resembles intelligent behaviour”.

This definition was adopted because, as UNESCO point out, “the rapid pace of technological change would quickly render any fixed, narrow definition outdated”.

Staff and learners are already regularly using AI systems, often without realising. Search engines, smart assistants, chatbots, translation tools, navigation apps, online games and many other applications use forms of AI.

Generative AI (GenAI)

Much of the recent interest and debate about AI and its impact on education centres on this field of AI technology.

GenAI is a technology which generates content in response to written prompts. It can produce new content in various formats: texts, images, videos, music and software code. Tools such as ChatGPT, DALL-E, Claude and Google’s Gemini are all forms of GenAI.

The new content produced by these tools is not based on real understanding: GenAI “generates its content by statistically analysing the distributions of words, pixels or other elements in the data that it has ingested and identifying and repeating common patterns”. It works in a similar way to predictive text, but in a much more advanced form.

While GenAI produces new content, as UNESCO point out, it “cannot generate new ideas or solutions to real-world challenges, as it does not understand real-world objects or social relations that underpin language. Moreover, despite its fluent and impressive output, GenAI cannot be trusted to be accurate.” 

GenAI has an inbuilt tendency to invent factually inaccurate information, known as “hallucinations”, in response to user prompts. These can appear plausible because GenAI is designed to produce fluent, coherent content.

GenAI is of particular interest in education due to its ability to produce content which can be used by education institutions, staff and pupils.

NEU on AI in education

We want:

AI technologies harnessed to realise benefits for education: giving staff more free time and professional control, and enhancing teaching and learning.

We do not want:

The technology used to increase staff-pupil ratios, drive cost savings or profits, enable employers to intensify or increase the volume of work in other areas, or impose pedagogical or curriculum approaches on staff and students.

Technology, including AI, has the potential to help improve teaching and learning, but it can also have detrimental impacts.

Much depends on the priorities and ideology that guide its development, adoption and implementation. This includes the commercial goals and motivations of the companies producing and selling the technology.

It is often claimed that AI and other forms of edtech will reduce workload. However, this promise has not always materialised: teachers’ and leaders’ work is sometimes displaced and shifted towards more routine tasks and the administration of the technology itself. The overall result can be to devalue teacher expertise, eroding agency and autonomy, and to reduce the quality of education.

Currently there is too much pressure on schools from a narrow and oppressive accountability system, underfunding and excessive workload. These constraints, as well as a school system that is increasingly top down, mean that staff development and voice are often limited when it comes to engaging with technology. At the same time, these pressures increase the vulnerability of educators to tech tools marketed and presented as time savers.

With the rush to develop and market AI to schools, there is a risk that it may become even harder for staff to navigate the use of technology and ensure it is adopted in ethical ways and in the best interests of students and staff.

Creating space for greater educator agency and negotiation in relation to technology is therefore more vital than ever.

What can I do to improve policies around AI in schools?

A good place to start is by talking to your colleagues and discussing the issue: asking whether and how they use AI, and what their experience has been.

There may be opportunities to use discussion about AI to open up wider questions about how technology is used in your school. 

The following principles are a guide for arguing for greater professional expertise and judgement in decisions around technology in schools:

  • Schools should adopt and use digital technologies to support and enhance teaching and learning in ways that support equality, equity and inclusion.
  • Staff and their unions should be engaged in decisions about the design, development, procurement, implementation, review and continued use of digital technologies.
  • An ongoing process of monitoring, review and evaluation of the use and impact of digital technologies should actively engage staff unions and include workload considerations.
  • Teachers should have sufficient time to plan and prepare for the incorporation of new digital technologies. They should also have time to assess the usefulness of digital technologies once they are in place. 
  • Staff should have specific guidance, support and CPD for any new technology which should be updated regularly.