Teachers should use ChatGPT and generative AI to develop critical literacy, partly to make students realise that algorithmic decision making isn’t necessarily objective
Opinion: Schools have been grappling with how to deal with ChatGPT since the AI technology burst onto the world scene late last year, with many schools banning its use. But the tide is turning, as our educators acknowledge that understanding ChatGPT, and other forms of AI, is a crucial part of education.
Things have changed quickly since the beginning of 2023, when US school districts in New York City, Los Angeles and Baltimore implemented bans on ChatGPT use. These districts cited the lack of a clear, effective strategy on how it could be used to support learners, potential negative impacts on student learning, concerns regarding the safety and accuracy of content, and the risk that it could be used by students to cheat.
These entirely valid concerns were mirrored in New Zealand, where the Ministry of Education outlined how AI might reproduce inequalities and be used as a tool for plagiarism.
But evidently schools and education ministries are recognising that AI tools should be used in education, and are developing frameworks for how they can be.
In October, for instance, it was announced that ChatGPT would be allowed in all schools in Australia, with education ministers formally backing a national framework to support teachers and schools to integrate it into teaching practice.
New Zealand is following suit, with the Ministry of Education now working to develop policies and advice for schools and teachers to ensure the education system is capable of responding to improvements in AI. It has yet to announce its policies on generative AI, but so far, its focus has been mainly on how not to use generative AI.
As a teacher and a researcher exploring the influence of digital technologies, spaces and narratives on learners, I would argue that we need to help teachers use ChatGPT in the classroom.
What do teachers have to teach pupils about ChatGPT? Offering an understanding of AI, how AI systems operate, even on a basic level, and how these systems shape the world we live in, would be a step in the right direction.
ChatGPT could also be used as an ‘object-to-think-with’: a cultural object that is part of learners’ social environment and can be used to examine social issues.
Using ChatGPT in this way would enable teachers to demonstrate that AI is a useful tool that can be used in certain situations, but that it is not neutral and students need critical, reflective thinking to use it appropriately and ethically. They also need to learn how to assess the variety of perspectives presented through ChatGPT, the extent to which those perspectives are true, and to be alert to bias in AI systems.
For example, in my classes, I have prompted ChatGPT to summarise the significance of issues relating to youth in Aotearoa New Zealand and then asked students to search for evidence that either supports or contradicts the claims made by ChatGPT.
This could even be conducted at primary school level. ChatGPT can be used to explore the history of cities, landmarks and geographical sites, and teachers can then guide students to examine whose voices have been included and excluded in those histories.
Teachers can therefore use ChatGPT to develop critical literacy. It will help students learn that algorithmic decision making isn’t always objective, and that the data collected by AI doesn’t necessarily reflect the average or most probable conclusion.
For example, during the pandemic, algorithms were used to determine students’ A-level and GCSE grades in the UK. Though this may have seemed reasonable, analysis showed that the algorithm used socio-historic school data to determine students’ scores, resulting in students from private schools and affluent areas receiving more favourable scores at the expense of high achievers from disadvantaged backgrounds. Similar problems have been found with AI decision making on things like mortgage applications and recruitment.
Children (and quite possibly a few adults) need to learn that the world contains patterns of injustice, and that these are reflected in the data AI is trained on. AI performs single, simple tasks at an impressive rate, but its results don’t yet reflect the range of cultural, social, historical or economic factors that inform the real, more complex, world.
ChatGPT can synthesise large amounts of data, but it still takes human understanding of the real world to analyse the value of what it is producing. Rather than looking at an issue through one discipline, teachers can use ChatGPT to encourage students to consider a range of different perspectives and agendas that are, or aren’t, embedded in the data used by AI.
This is not just a matter of understanding how to use generative AI, but also understanding just how deeply embedded AI is in the networks we use to communicate, to shop, for entertainment and, most notably, to gather information.
Ultimately, if education is to prepare students to thrive in a complex world, it needs to be rooted in their real experiences. Teachers can use ChatGPT to explore a variety of different people’s realities, and to consider a range of perspectives and experiences before drawing conclusions.
While the Ministry of Education works on developing policies and guidelines for teachers and schools, ChatGPT can be critically embraced in schools today.
Teachers need to be given the support, space and flexibility to embrace AI in their teaching. Many teachers already are, but to support more widespread adoption of ChatGPT in schools, it’s vital the policies and guidelines don’t just advise teachers how not to use ChatGPT, but how it can be used. AI is not some passing fad, so we need to get it to work for us, not against us.