The emergence of free-to-access generative AI programmes has the potential to transform the classroom in ways we’re only beginning to understand. How can we protect students from the dangers of AI?
The internet, perhaps the last technological innovation comparable to AI in scale, changed the way students communicate, work, and access information. But tools like ChatGPT appear to be doing much more than that: they’re changing the way students live, work, and interact with technology every day.
Just as with the internet, it’s difficult to predict now what this technology may lead to in a decade’s time. What is clear is that students have been very quick to utilise these tools to support their school work. Recently commissioned research from RM Technology reveals that over two thirds of school students are using the technology frequently, with some using it to solve mathematics problems, write English essays, translate texts in foreign languages, and more.
Despite the huge opportunities AI tools offer, many students are also struggling with their rapid introduction into education. This is an aspect of the technology that too few are talking about. And we’ll need to do exactly that if we’re to guide children towards using the technology efficiently and safely, and protect them from the dangers that come from others misusing AI.
Ultimately, anyone with a stable internet connection and a computer is now able to produce harmful multimedia content, which could have a severe impact on a student’s wellbeing.
Providing ‘safety rails’ is key
While the majority of school children are using AI to help with their school work, its popularisation has also increased their anxiety about education. Seventy per cent now worry that they’ll struggle in exams without the support of AI tools. And although 68% of students say they are achieving better grades since they started using it, at least half of those children feel guilty about doing so.
It’s clear that there’s a significant lack of guidelines for the use of AI in schools. Clear ‘safety rails’ that describe effective practice could help to dispel anxiety and reduce guilt. They would ensure that, however students use AI, it will not hamper their potential in exams.
Introducing effective safety rails, though, is going to present a considerable challenge to the sector. The technology is advancing at a pace that regulators struggle to match. The Online Safety Bill now passing through the House of Lords may soon be outdated.
Legislation and guidance take time and experience to draft. With AI, we have little of either.
The public and private sectors need to come together to ensure that schoolchildren in the UK are protected from the potential harms of this new technology.
Parents and teachers must be equipped so they can work together to safeguard students
Government regulation will, without a doubt, prove crucial in guiding students’ use of AI in schools. Nevertheless, other stakeholders will need to play their part, since effective regulation remains some way off.
That’s where teachers come in. Yet without guidance or training, it can be difficult for teachers to steer students’ use of AI effectively. Indeed, just over a third of teachers believe that students have a better understanding of AI than they do.
Schools must work with their technology partners to implement comprehensive training plans that cover every aspect of the use of AI, both in the classroom and outside it. Teachers must be equipped with the knowledge to decide how and where it is appropriate and beneficial, and how to spot when a student is misusing AI.
Safeguarding outside the classroom
Outside the classroom, it becomes difficult for teachers to guide and protect their students while they use AI. But parents will play an important role, since teachers can only do so much once a child has left the classroom.
Communication between parents and teachers is therefore vital if students are to be safeguarded outside the classroom. Informative material on the school’s AI policy, and on how parents can safeguard their children, is one example of how schools can work with parents to ensure students continue to have a safe experience with AI outside the classroom.
It is clear that the education sector needs to introduce guidelines for students’ use of AI – and quickly. What we must also recognise is that government regulation may take a long time to offer that guidance and protection. Teachers, parents, and the private sector all have an important role to play. And through effective training and collaboration, they can ensure children can continue to learn in a safe and productive environment.
This piece was written and provided by Mel Parker, Educational Technologist at RM Technology