A few weeks before he was due to sit his first GCSE exam this summer, Ryan was finally rumbled. The teenager had been using ChatGPT to write his English literature coursework, changing a few bits here and there so the teachers wouldn’t get suspicious. It worked for a while — until it didn’t.
“The teachers found out and I got into so much trouble,” says Ryan, now 17. He was reported to the exam board and disqualified from the entire GCSE. “I got detention, wasn’t allowed to go to the leavers’ prom, and my mum and dad were so mad with me. Once I had got away with it the first time, it seemed like I wasn’t going to get caught.”
For the first time Ofqual, the body that regulates qualifications, examinations and assessments in England, has included a category for plagiarism using AI in its annual malpractice report, which is due to be published later this month. These figures will also detail what sanctions students have faced as a result of malpractice.
The use of AI among students has been dubbed “the homework apocalypse” by Ethan Mollick, an academic at the Wharton School of the University of Pennsylvania and a leading voice on AI in education. One in five UK secondary school pupils use AI for homework, rising to 31 per cent for those on free school meals, according to the annual parent survey by the charity Parentkind. Anecdotally, however, the use of AI among children appears to be far higher — and while many kids use it perfectly legitimately for research and revision, some go much further.
One dad says his 15-year-old daughter had a WhatsApp group with her friends in which they were all sharing homework answers. “And my other daughter, who is 13, used Google Lens to take pictures of maths problems and get instant solutions,” he says. “I was angry and talked to her about it, but part of me sort of understands. I use ChatGPT for work, to help summarise documents, and it must be so tempting.”
Private tutors have seen children as young as nine using AI to do their work for them. “One child admitted to using it to write an essay titled The Storm,” says Chris Pearse, the managing director of Teachitright, which offers tuition for 11-plus and Common Entrance exams. “They said it saved so much time as they had such a busy week with school and extracurricular activities.”
Naturally Mumsnet users have plenty to say on the subject. One writes that their son used ChatGPT to complete his A-level coursework. “The school reported it, the exam board checked it and he was disqualified from the whole A-level.” Another Mumsnetter describes how her daughter was given an essay to write on reincarnation as part of religious studies homework: “All the class basically used AI to write the homework, leading to presumably a teacher spending hours marking the results of an algorithm.”
Yet, as Ryan discovered, teachers can mostly spot AI work; if in doubt, many will run suspicious essays through AI checking tools such as Turnitin. “According to my son’s history teacher, perfectly placed semicolons are the best way to spot AI-generated schoolwork,” one dad says. AI can also often be wrong — University of Pennsylvania researchers conducted a study that found high-school students who used AI programs such as ChatGPT for maths practice performed worse on tests than their peers who practised without AI assistance.
Laura Gowers, a teacher and the founder of This Is Dyslexia, an assessment and support service, regularly sees homework that has been completed using ChatGPT. “One student had a whole section of text in the middle of an essay question that was clearly written using AI,” she says. “It sounded robotic and not like the student’s usual work or in the tone of the rest of the essay.” It also, she adds, included a number of words often used in AI responses, such as “plethora”, “leverage” and “myriad”, none of which tend to trip off a typical 14-year-old’s tongue.
While relying too heavily on AI may inhibit critical thinking and discourage genuine learning — and in the worst cases help children to be outright dishonest — it can have an important place in helping children to learn when used correctly, says Dr Shweta Singh, an AI expert at Warwick Business School. “These tools offer students immediate access to vast amounts of information and can enhance their understanding of complex topics,” she says. The Joint Council for Qualifications (JCQ), which represents exam boards, has issued a holding document saying AI can be used for coursework but must be referenced. Malpractice can result in severe penalties.
AI can be useful for neurodivergent and dyslexic children and those with special educational needs, and some independent schools have even devised their own AI tools. Caterham School in Surrey has created the RileyBot, an AI chatbot designed specifically for schools — it’s suitable for children from the age of four and has strict safeguarding protections. Sherborne School, the private boys’ school in north Dorset, has just built its own version, the Sherbot.
For parents who are worried about how to navigate all this, experts say the best way is to talk honestly with children about AI, about where it can be helpful for research but how it should be treated with scepticism and shouldn’t be used to cheat. “It can be helpful for the parent to model how they use AI in their work so that their child has a concrete example of good practice in use,” Gowers says.
Ultimately it pays to remember that your child is likely to be much more tech-savvy than you. “I thought I’d made my son’s smartphone into Fort Knox by putting on all manner of parental controls,” one mum says with a sigh. “I saw the first thing he searched for was: ‘How do you disable parental controls on a phone?’”
Source: Shortcut or cheat: how acceptable is Chat GPT for school work?