Schools have been told to make children do some of their coursework “in class under direct supervision” amid fears artificial intelligence (AI) systems can be used to cheat. The Joint Council for Qualifications (JCQ), which represents the UK’s major exam boards, has published guidance on “protecting the integrity of qualifications” in the context of AI use.
Schools should make pupils aware of the risks of using AI and the possible consequences of using it “inappropriately” in assessment, the guidance says. It adds: “Students who misuse AI such that the work they submit for assessment is not their own will have committed malpractice, in accordance with JCQ regulations, and may attract severe sanctions.”
ChatGPT is a form of generative AI which has come to prominence in recent months after a version was released to the public last year. It can respond to questions in a human-like manner, follow the context of follow-up queries much as in a conversation, and compose essays on request.
In February, a graduate of the University of Bristol tested ChatGPT by asking it to write a 2,000-word essay on social policy. The AI produced the work in 20 minutes and the former student was told it would have passed with a score of 53 – a 2:2.
The guidance for teachers and assessors – which highlights a number of AI chatbots including ChatGPT – suggests “allocating time for sufficient portions of work to be done in class under direct supervision to allow the teacher to authenticate each student’s whole work with confidence”.
It comes as MPs on the Commons Science and Technology Committee were warned of the risks that ChatGPT poses to the assessment system. Daisy Christodoulou, director of education at No More Marking, told MPs on Wednesday: “I do think that ChatGPT has huge implications for continuous assessment and coursework. I think it’s very hard to see how that continues.”
She added: “It is capable of producing original, very hard to detect, relatively high-quality responses to any kind of question. They won’t be perfect, but they’ll be good enough for a lot of students.
“I think that means that uncontrolled assessments – where you’re not sure how the student has produced that work – become very, very problematic. I do think we have to be thinking that there is a value in having those traditional exams.”
Ofqual’s chief regulator Jo Saxton said ChatGPT has made traditional exam conditions “more important than ever”. Speaking to headteachers at a conference earlier this month, she said pupils could be asked to complete their coursework and essays under exam conditions in schools following the rise in the use of AI technology.
Christodoulou, a former head of assessment at Ark Schools, added that she had heard of teachers using ChatGPT to create questions for students at the end of lessons. She warned MPs: “This is where it gets more problematic.
“The more open you make it and the more you give it more latitude, it does make a lot of errors, it makes an awful lot of errors.”
On ChatGPT, Christodoulou said: “It’s very good at basically cheating, giving students essays that are not their own. It is very good at that and it can do that at scale and rapidly.
“For the more socially useful things in education, such as creating questions and marking, it’s not as good as we’d hoped.”
Meanwhile, Education Secretary Gillian Keegan has said that AI has the “power to transform” teachers’ day-to-day work. Speaking at the Bett show in London on Wednesday, she suggested the use of AI in education could significantly reduce tasks that “drain teachers’ time”, such as lesson planning and marking.
A document setting out the Department for Education’s (DfE) position on the use of generative AI – including ChatGPT or Google Bard – in the education sector calls on schools, colleges and universities to continue “to take reasonable steps where applicable to prevent malpractice”. It adds: “Schools and colleges may wish to review homework policies, to consider the approach to homework and other forms of unsupervised study as necessary to account for the availability of generative AI.”