Dr Ruth Phillips, Academic-in-Residence
Artificial Intelligence (AI) is a topic that seems to be everywhere, and education is no exception. As with any new technology, our initial reaction is often one of suspicion, a trait that has helped humanity survive for millennia. Alongside this suspicion, there is also a mix of hype, fear, and exaggerated claims about both the positive and negative impacts of AI. Like all new discoveries, AI in education comes with its own set of pros and cons.
The truth is, at this point, we simply don’t know enough yet to fully understand its implications. AI is already transforming education and workplaces, helping to reduce workloads and boost productivity. But for educators, the crucial question remains: how does AI impact student learning and wellbeing? To address this, Professor Matt Bower collaborated with the staff at Frensham Schools to engage with and better understand AI in education, and to critically evaluate its role in our school.
Should we use AI tools? Yes, but…
Artificial intelligence tools contain implicit biases, can produce work with inaccuracies and often lack transparency about their data sources. This means we, as educators, must develop responsible and ethical policies to guide young people's use of these tools.[1][3] By creating such guidelines for students, we can encourage thoughtful application of AI technologies. The temptation for young people will be to use AI to get work over and done with rather than to build competence. Imagine being 15 years old, with an assignment due tomorrow that you haven't done enough work on. Now imagine you found someone, or something, that would do the task for you, often for free, in seconds. Would you have been tempted to use it?
Instead of avoiding AI, we must learn to use it wisely—ethically, critically, and with purpose.
Generative AI can benefit learners by:
• helping students find personalised curriculum pathways,
• producing customised learning content,
• providing information about student progress,
• informing teacher decision making,
• contributing to learning design processes, and
• assisting in assessment and feedback.[2]
Will it make us less intelligent? Perhaps if we don’t…
Critical thinking is arguably the most important skill that young people can take into the workplace. There is evidence that the ability to think critically is impaired when we use AI to create fast, surface-level responses rather than thinking deeply and slowly.[4] If we rely too heavily on AI tools, we can fall into the trap of overusing cognitive shortcuts, or heuristics, and may not engage in critical, higher-order thinking.[5]
To mitigate this risk, learning experiences using AI tools must require students to critically evaluate AI-generated and synthesised content, compare it with human-generated insights, and develop an understanding of the limitations of AI outputs. Tasks that require reflection on the biases present in AI algorithms and data sets will help students learn to critically evaluate information from diverse sources, in turn allowing them to develop a more nuanced use of AI tools.[6]
Yong Zu’s research emerging from Harvard also emphasises the need for human interaction. She suggests that while AI can simulate some educational interactions, it cannot replicate the deeper engagement and relationship-building of human interaction, particularly the follow-up questions and personalised conversations essential for language and social development.[7] Personalised learning is at the centre of the Frensham Schools educational philosophy, and it cannot be imitated by AI.
How do we support our young people in this space?
We will ask good questions:
• Is AI the best way to go about this task?
• How does this compare with other information we have on this topic?
• How might we check that this information is not fake?
• How might we critically assess the impact of fake information in this area?
We will model use for appropriate tasks:
• Compare AI-generated information from different platforms, identify inconsistencies, and analyse why they occur,
• Require the use of AI to be referenced as you would any other resource,
• Create tasks in which AI may not be useful,
• Provide students with AI-generated information and challenge them to apply it to a complex, open-ended problem with no single correct answer,
• Design learning which requires human creativity and critical thinking; students learning philosophical thinking in Year 8, for example, need their critical and creative thinking skills to consider issues about the human condition in ways that only humans can, and
• Use specifically designed instructional and thinking strategies which AI cannot currently address.
At Frensham Schools we will be doing all of the above, including requiring students to work in analogue, setting tasks where laptops and technology are not used. We will keep developing our understanding of the benefits and pitfalls of AI in education, and we will pivot and adapt when needed.
