AI in Schools: What Safe Use Looks Like for Pupils and Staff

Feb 9, 2026

Introduction: Why schools are talking about AI now

Artificial intelligence is becoming part of everyday life, and schools are no exception.

We are increasingly seeing AI tools being used by pupils, often outside of school and without adult oversight. At the same time, staff are beginning to explore AI to support planning, workload, or administration.

For many schools, there is a sense of uncertainty around AI: its role within education, and how to use and teach about it in an age-appropriate way. This is completely understandable. AI tools are developing quickly, and guidance is still evolving.

In this blog, we explore the key considerations schools are raising and offer practical guidance to support informed decision-making on this topic. We also walk through an example of teaching about AI in the classroom.

What do we mean by “AI” in a school context?

When we talk about AI in schools, we are usually referring to tools that can generate responses, images, or text based on prompts from a user.

In practice, this often includes:

  • Chatbots that answer questions or hold conversations

  • Image generators that create pictures from written descriptions

  • Writing and homework tools that can suggest ideas, improve wording, or complete tasks

Many of these tools are designed for adults, but they are easy for children and young people to access, particularly outside of school.

It is also important to recognise that AI is not always a standalone app or website. Increasingly, it is built into platforms and software that schools and families already use, sometimes in ways that are not immediately obvious.

Because of this, pupils and staff may encounter AI without actively seeking it out or realising it is being used. This can make conversations about safe and appropriate use feel more complex.

Understanding what we mean by AI, and where it already appears in everyday digital tools, is a helpful starting point for building confidence and clarity in schools.

Why “safe use” matters in schools

AI tools raise different considerations to many of the digital tools schools are already familiar with.

Unlike pre-recorded content, AI responds to users in real time. It adapts its language, tone, and suggestions based on what is typed in. For children and young people, this can feel personal and convincing.

AI tools can also sound confident and authoritative, even when the information they provide is inaccurate or inappropriate. For pupils who are still developing critical thinking skills, this can make it difficult to recognise when something is wrong or misleading.

These features raise important safeguarding considerations. Pupils may place trust in AI responses, seeking advice on sensitive topics and receiving answers that may not always be safe or suitable.

Children are still learning how to evaluate information they see online. This makes it especially important that any use of AI in a school context is guided by clear expectations and ongoing conversation.

Safeguarding risks schools should be aware of

AI tools are designed to generate responses that sound helpful and confident. While this can be useful, it also introduces safeguarding considerations that schools need to be aware of.

One key risk is misinformation. AI tools can produce responses that sound plausible but are inaccurate, incomplete, or misleading. For pupils who are still developing the ability to evaluate information, this can make it difficult to know what can be trusted.

Some AI responses can also feel emotionally supportive or persuasive. The language used may appear understanding or reassuring, even though it is generated by a system that may have no real understanding of context. For some pupils, this can create a sense of personal connection that feels real.

There is also the risk of over-reliance. Pupils may begin to trust AI advice or guidance in situations where adult support would be more appropriate. This can include sensitive topics where professional judgement and trusted relationships are essential.

Ongoing conversations can help schools reduce risk while supporting pupils to develop safe, thoughtful approaches to AI.

Staff use and confidence

Many staff are naturally curious about how AI tools might support planning or workload. Others, facing limited time and increasing demands, sometimes explore tools informally, without clear guidance in place.

Clarity matters because it supports consistency across a school. When expectations are shared, staff are more confident about what is appropriate and how AI fits within safeguarding responsibilities.

Clear guidance also strengthens safeguarding. It helps staff understand which uses are supported, where boundaries sit, and when additional checks or conversations are needed. This reduces uncertainty and supports safer decision-making.

Professional confidence grows when staff are not left to make individual judgements in isolation. A shared understanding, developed through discussion and guidance, allows staff to use technology thoughtfully while feeling supported.

Creating space for open conversations about AI helps schools build confidence over time and maintain a consistent approach as technology continues to change.

 

Many schools have told us that staff conversations are a helpful starting point when navigating AI use.

We have created a short, practical AI in Schools: Safeguarding Discussion Checklist for Staff to support these discussions:

Download the AI in Schools: Safeguarding Discussion Checklist for Staff here

 

An example of safe AI use in practice

Pupils benefit from knowing when AI can be used and when it is not appropriate. Being clear about boundaries, particularly around learning and independent work, helps avoid confusion and supports responsible use.

It is important to acknowledge that using AI in schools raises real and valid concerns for teachers.

Time pressures, age restrictions and staff confidence all play a part. Not every classroom has the same level of access, and not every school is in a position to use AI tools in lessons.

The example below is not intended as a model for every setting. Instead, it offers one possible way schools might approach safe use where AI is already being encountered:

 

In a Year 6 classroom, pupils are learning how to plan a piece of persuasive writing.

The teacher explains that an AI tool will be used to support thinking, not to generate finished work. Pupils are shown how to ask the AI for ideas or structure, such as examples of persuasive language or ways to organise an argument.

Before using the tool, the class discusses a few shared expectations:

  • AI suggestions may not always be correct or appropriate

  • Any ideas taken from the tool need to be checked and adapted

  • Final writing must be the pupil’s own work

 

The teacher models how to question an AI response, highlighting where information might be unclear or unsuitable. Pupils are encouraged to talk through what sounds helpful and what does not.

During the lesson, the teacher remains involved, checking prompts and discussing responses with pupils. Pupils are reminded that if something feels confusing, worrying, or unexpected, they should raise it with an adult.

After the activity, the class reflects on the experience.

They discuss:

  • What the AI helped with

  • Where it was less helpful

  • Why adult guidance matters

 

Why this approach works well

This approach supports learning while reinforcing critical thinking and clear boundaries. It also helps pupils understand that AI is a tool to support learning, not a replacement for their own ideas or judgement.

Supporting schools moving forward

As AI continues to develop, it is understandable for schools to feel unsure about what comes next. Schools do not need to have all the answers immediately, and it is reasonable for understanding and practice to develop over time.

Taking gradual, informed steps can be more effective than trying to resolve everything at once.

This might begin with staff discussion or reviewing existing online safety education with AI in mind.

At OpenView Education, we support schools in navigating emerging online safety challenges through our interactive online safety workshops and training.

Further Resources

Department for Education: Generative AI in Education – Guidance for Schools and Colleges
DfE guidance outlining how generative AI can be used safely and effectively in schools. It covers safeguarding, data protection, and staff responsibilities, making it a practical starting point for senior leaders and classroom teachers.

Department for Education: Keeping Children Safe in Education (KCSIE)
Statutory safeguarding guidance for schools in England. While not AI-specific, it provides the framework within which AI use should sit, particularly around online safety, filtering, monitoring, and staff responsibilities.

UK Safer Internet Centre – Guidance for Schools
Practical advice and resources to support schools in delivering online safety education. Useful for aligning AI conversations with broader digital literacy and critical thinking work.

ICO: AI and Data Protection Guidance
Guidance from the Information Commissioner’s Office on how AI interacts with UK GDPR and data protection law. Particularly helpful for school leaders considering staff use of AI tools or pupil data input.

Education Endowment Foundation: Using Digital Technology to Improve Learning
Evidence-informed guidance on effective digital technology use in classrooms. While broader than AI alone, it supports thoughtful implementation and avoids adopting technology without impact.

NSPCC Learning: Online Safety
Safeguarding-focused guidance that helps schools consider how emerging technologies fit within wider child protection responsibilities.

Ofsted: Education Inspection Framework
Provides context on how digital literacy, safeguarding, and curriculum intent are evaluated. Useful for leaders considering how AI fits within curriculum planning and safeguarding culture.
