The English schools looking to dispel ‘doom and gloom’ around AI


Charles Darwin chatting with students about evolution, primary school pupils seeing their writing transformed into images, Luton reimagined as a cool automobile – artificial intelligence is invading schools across England in surprising ways.

While Bridget Phillipson, the education secretary, called in January for a “digital revolution” involving AI in schools, that revolution has already begun in places such as Willowdown primary school in Bridgwater, Somerset.

Matt Cave, Willowdown’s head teacher, said his pupils improve their descriptive writing by feeding their work into an AI image generator.

“All of a sudden they’ve got all these pictures from different people’s descriptions, and they can then discuss with their classmates whether that was the image they expected to be in the reader’s head,” Cave said.

“It was really stimulating and thought-provoking for them to have a different audience.”

The results, according to Cave, have been “brilliant” and a contrast to the “doom and gloom” he had heard from worried school leaders.

“I wouldn’t want anyone to think we weren’t aware of the potential hazards – we emphasise that to the children continually. But it’s going to be a tool that they are going to need to use all their lives,” Cave said.

“In Bridgwater we’ve got Hinkley Point being built, the new nuclear power station, and Gravity, which is a massive gigafactory for batteries for Jaguar Land Rover. That’s all going to be hi-tech businesses and children are going to need to know this stuff to get on with employment in the local area.”

Marina Wyatt, head of science for key stage 3 at Furze Platt senior school in Maidenhead, said she had found teacher-led use of AI useful for engaging students in discussions, including with a virtual Charles Darwin.

“We prompt the AI before we take the class – we tell it: ‘Imagine you are Charles Darwin, you have students from a science class who are interested in your experience around the world, they particularly want to know about the theory of evolution, natural selection, variation and inheritance.’

“In the prompt we tell it to respond as Charles Darwin, and stay in the role. And it works. It came up with some brilliant stuff.

“Children who often don’t have the opportunity to participate, for one reason or another, were hooked on this and were asking questions like crazy,” Wyatt said.

Wyatt could screen the ChatGPT Darwin’s answers to the students’ questions before playing them aloud to the class, allowing her to filter out inaccuracies or bias.
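
For readers curious how such a role-play session might be set up, here is a minimal sketch using OpenAI’s Python library. The model name, the exact prompt wording and the manual screening step are illustrative assumptions, not a description of Furze Platt’s actual setup.

```python
# Minimal sketch of a teacher-led "virtual Charles Darwin" session.
# Assumptions: OpenAI's Python SDK and the "gpt-4o" model; the prompt wording
# and screening step are illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "Imagine you are Charles Darwin. You are speaking to a key stage 3 science "
    "class interested in your voyage around the world, the theory of evolution, "
    "natural selection, variation and inheritance. Respond as Charles Darwin "
    "and stay in the role."
)

def ask_darwin(question: str) -> str:
    """Send a pupil's question to the role-played Darwin and return the reply."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    answer = ask_darwin("What did you notice about the finches on the Galapagos?")
    # The teacher reviews the answer for inaccuracies or bias before
    # sharing it with the class, as described above.
    print(answer)
```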

Wyatt said students were not given direct access to AI tools while the school developed policies for their use, covering issues such as parental consent and data security.

Daisy Christodoulou, head of education for No More Marking, a firm adapting technology for classroom assessments, said the exciting uses of AI and large language models (LLMs) came with concerns about the effects on how pupils learn.

“The first problem – and most fundamental problem – is a basic principle from cognitive science: learning is not performance,” Christodoulou said.

“What this means is that the fundamental underpinning skills and knowledge you need to become an expert user of LLMs – or indeed any modern technology – are often not best acquired simply by playing around with the technology.

“A lot of adults find LLMs useful because they already have the basic literacy, numeracy and background knowledge to make sense of their outputs. Twenty years ago we heard a lot of hype about how you didn’t need to know anything because you could just look it up on Google. That was wrong, and we are in danger of repeating the error with LLMs.”

Emma Darcy, director of technology for learning at Denbigh high school in Luton, said the uses and pitfalls of AI were taught in weekly “digital character” classes for year 7 pupils.

“After the explosion of ChatGPT two years ago, we didn’t want to wait for official guidance to come out because we knew we needed to be having those conversations with our staff and students,” Darcy said.

“We’ve got a student AI steering group as well that meets monthly. We thought it was important because young people are the end users of the technology but don’t get a voice in how it’s used in school.”

But the school has enabled controlled use of AI in some cases, using the Canva graphic design software.

“We did a big project with the whole school around presenting positive images of Luton and we asked the students to use Canva to help generate an image of a car representing Luton and the Luton community,” Darcy said.

“But what we were actually teaching was language and literacy skills, what a good prompt would look like and image generation. What we don’t do is send students directly on to an LLM – it needs to be done with a clear learning objective and purpose.”


