The Guardian - AU
Technology
Caitlin Cassidy

ChatGPT has become the ‘best teammate’ to these Sydney university students – but is there a limit?

Abigail Bobkowski (left), Yihong Yuan and Jack Quinlan: students at the University of Sydney who are using generative AI. Photograph: Mike Bowers/The Guardian

Third-year student Jack Quinlan was confident he knew what I was going to ask before we conducted our interview. He wasn’t psychic, and I hadn’t fed him questions – he’d just done a trial run on ChatGPT.

Prior to our meeting, the software engineering and neuroscience undergraduate logged on to the program to generate the kinds of questions a “professional journalist at the Guardian” would ask a student about artificial intelligence at universities.

“What prompted your university to begin using generative AI tools in education?” the software version of me began. “How have students and educators at your university responded to the introduction of generative AI? Have there been any challenges and concerns raised?”

Quinlan gave detailed answers and the bot responded in turn, offering encouraging comments like: “That’s fascinating!”

ChatGPT caused massive upheaval across the globe when it launched at the end of 2022, prompting Australian universities to change the way they ran assessments and temporarily return to pen and paper exams.

Since then, fears that students would use the emerging software to write essays and complete assignments have been realised, with academics citing rising levels of plagiarism that unreliable detection software often fails to catch.

But as universities wrangle to implement new academic integrity policies and crack down on cheating, artificial intelligence is also being reframed as a potentially revolutionary teaching and learning tool – one that, if left unadopted, may see a generation of students fall behind.

The ethical use of AI

When ChatGPT first launched, the University of Sydney was on the back foot, the institution admits. Now it says it places digital technology at the forefront of its curriculums.

It’s been a similar shift across the university sector, which has broadly pivoted to acknowledge generative AI in academic integrity policies and incorporate the new technology into learning and teaching. At the University of Melbourne, for instance, artificial intelligence is even being used to help mark assessments.

This year, the University of Sydney was named AI university of the year at the inaugural Future Campus awards, run by an outlet established last year to bring news and analysis to the higher education sector. The award recognised its development of an AI assistant that answers students’ questions about course content and syllabuses across more than 300 units of study, 24 hours a day.

The tool operates alongside an “AI in Education” guide, developed by students and staff in recent months to illustrate how to responsibly use AI for learning – without crossing into the territory of plagiarism.

The university’s pro vice-chancellor of educational innovation, Prof Adam Bridgeman, says the guide forms part of the institution’s broader work to reshape assessments so they incorporate, rather than ban, the use of AI.

While it will require a lot of academic upskilling, Bridgeman says nobody can outrun new technologies, particularly at the speed they are advancing. Instead, he says, students and staff must be taught to use generative AI ethically, in the expectation that within a decade it will be cited on resumes as commonly as proficiency in Microsoft Office.

“The truth is that many students know more about AI than most of us,” Bridgeman says. “What we don’t want is an environment where students use AI but don’t tell us about it.

“If AI makes it easier for students to get through their degrees, we need to rethink the nature of assessment and create new challenges that prepare them for a future where this technology is an intrinsic part of life and work.”

An efficiency tool

Yihong Yuan, a PhD student in the school of computer science, helped create the guide. She says AI is the “best teammate you can have” while studying.

Rather than copying a question and asking generative AI for the answer, the guide encourages students to use it for writing prompts, proofreading, assessment feedback or brainstorming. Given a marking rubric and an assessment, generative AI can offer feedback in seconds, where tutors may take a week or be unable to give each student individual feedback at all because of time constraints and class sizes.

“It’s a great efficiency tool,” Yuan says. “With research, it can’t create knowledge; it’s more like peer support whenever you need it. If I want to ask a question at 3am, I can’t ask my tutor, but that’s when it comes in handy.

“It’s providing me academic help, but also a kind of mental health cuddle as well, because some parts of education feel like a lonely journey.”

Quinlan is allowed to use generative AI in all his engineering subjects – and nearly all assignments. Before lectures and tutorials, his friends plug the subject’s learning outcomes into generative AI to be summarised, which is particularly helpful if they haven’t done the readings.

By contrast, Abigail Bobkowski – a fourth-year student doing a dual bachelor of arts and advanced studies – says that when she began studying, her faculty had been “super anti-AI”, implementing a blanket ban she felt was “cutting off” what students could do. She says arts subjects are still holding out on fully adopting generative AI, particularly in the classroom.

“It’s been really frustrating … from what I can do and how much it can help,” she says. “This is the first semester where I’ve had a course through the faculty of arts and social sciences where they’ve said you can use it.”

Ghost in the machine

This semester, Bobkowski is using genAI to create a virtual reality headset that will allow students to explore rare books. Quinlan is adopting it to help develop a headband that reinforces deep sleep states so people wake up well rested. And in computer science, Yuan is analysing artificial intelligence to determine how it can be used in education.

“It’s a mirror,” Bobkowski says. “It just shows us what we put into it.”

But there are still limits to generative AI. It isn’t at the stage of replacing human creativity – it isn’t James Joyce. It’s much better at summarising content into key points, drawing out the main themes into a clean, neat sequence.

Midway through one of my interviews, I instinctively start to respond “that’s fascinating” to one of the students’ answers. I feel, with a sense of dread, that I am completely replaceable, that there is no place at which the AI ends and I begin.

But no – that can’t be right. Our questions are tangentially similar, but bot-me lacks a certain degree of nuance, of humour. Bot-me could not write this article, I tell myself.

When I ask ChatGPT to turn the interview’s transcription (also generated by AI) into a news story, this is the lead it comes back with: “In an engaging discussion at the University of Sydney, students shared their perspectives on how generative AI is reshaping the educational landscape.”

Terrible.
