#322 – Safeguarding in the Age of AI: Who’s Responsible?

Episode Overview

In this episode, host Philippa Wraithmell speaks with Laura Knight, Founder of Sapio Ltd. They explore safeguarding in the age of AI, focusing on digital responsibility, student wellbeing, and ethical technology use. Laura shares why safeguarding is a shared responsibility between schools and parents, and explains how education must move beyond fear to build digital literacy, self-regulation, and resilience in young people.
Key Themes in This Episode

- Shared responsibility between parents and educators
- Digital literacy and self-regulation
- Risks of AI, including synthetic relationships
- The screen time vs. content quality debate
- Emotional wellbeing in digital environments
- Scenario-based learning for safeguarding
- Ethical AI and student responsibility
Why Listen to This Episode?
This episode explains how safeguarding must change in an AI-driven world. It goes beyond basic rules and focuses on real challenges like deepfakes, online relationships, and AI misuse. You will learn practical ways to support students, improve digital literacy, and create safer learning environments using a human-centred approach.
Who Is This Episode For?
This episode is for school leaders, teachers, safeguarding leads, parents, and EdTech professionals. It is especially useful for those working on digital safety, AI policy, and student wellbeing. Anyone responsible for protecting young people in a digital world will find valuable insights and practical strategies.
Full Episode Description
Artificial intelligence is changing education quickly, but it also brings new risks. In this episode, Laura Knight explains why safeguarding must evolve to match this new reality.
One key message is that safeguarding is a shared responsibility. Schools cannot manage everything alone, especially when students use devices outside school. Parents, educators, and communities must work together to support young people. Without this collaboration, there is a growing gap in responsibility that puts students at risk.
Laura highlights that many of today’s risks are new and harder to predict. For example, AI tools can generate deepfake content or simulate emotional connections with users. These synthetic relationships can feel real, especially to young people, making it difficult for them to recognise boundaries. This creates serious concerns around safety, trust, and wellbeing.
The conversation also challenges common ideas about screen time. Laura explains that screens are not the real problem. Instead, the focus should be on what students are doing online. Learning, creativity, and communication can all happen through screens. The real issue is understanding content, behaviour, and digital habits.
A key solution discussed is teaching self-regulation and digital literacy. Students need to learn how to manage their own behaviour online, make good decisions, and understand risks. This includes knowing when to question information, when to stop, and how to respond in difficult situations.
Laura also introduces scenario-based learning as a powerful tool. By discussing real-life situations, schools can help teachers and students understand risks better and build confidence in handling them. These conversations create clarity and reduce fear.
Finally, Laura shares insights from her white paper, which highlights three important pillars: capability, conscience, and courage. These help students build skills, develop strong values, and make responsible choices in a digital world.
Overall, this episode shows that safeguarding is not just about rules. It is about preparing young people to live safely, responsibly, and confidently in a world shaped by AI.
Podcast Host

Philippa Wraithmell is an education and digital-learning strategist based in the UAE. As the founder of EdRuption and Digital Bridge, she leads work on digital wellbeing, innovation, and evidence-informed practice. As host of The EdTech Podcast, Philippa explores how technology can elevate teaching, learning, and equitable education across the globe.

Special Thanks to Our Guest

Laura Knight is Founder and CEO of Sapio Ltd, specialising in AI, digital strategy, and innovation in education. A TechWomen100 Award Winner, she is an international speaker and author focused on ethical technology. Laura works with schools and organisations worldwide to support responsible AI use, digital safeguarding, and effective EdTech strategies.