In the past year, schools have been grappling with one of the biggest changes since the internet - the rise of generative artificial intelligence.
The conversations started in Term 4, 2022, when OpenAI launched ChatGPT, giving students the ability to draft an essay within seconds.
Since then, a host of AI tools have emerged, giving students access to more information - and more temptation to cheat.
St Mary MacKillop College student Noa Zisman has been using AI chatbots in her studies.
"I've asked some questions like, give me an essay question. And then I've written essays based on that question, or sometimes I've asked it to find me sources and study tools," Noa said.
Her twin sister Adi hasn't used AI in her schoolwork but could see that it would be a big part of her future.

"People are going to use it more and if students don't use it they may be behind the other people," she said.
Meanwhile Darcy Thripp has done his own investigations comparing different AI platforms and even had a go at creating his own AI tool.
"What it came up with was garbled but it was still pretty cool to make," he said.
"It's a topic that's always evolving. So there's always more to find out and discover and it's probably a career field that I'd like to end up in."
Schools around the world have been discovering the potential risks and powerful benefits that this new technology could bring to education.
Academic integrity
The first concern schools faced was the temptation for students to use generative AI to create their assessments.
St Mary MacKillop College acting assistant principal Tristan Burg said it forced his school to reflect on the purpose of assessment.
"The challenge was [that] as soon as the task is now outside the classroom, it's essentially insecure. And if the point of assessment is to assess student thinking, and we can't be sure it was the student or the AI doing that thinking, it makes assessment very challenging," he said.
Instead of making every assessment an in-class task, the school developed ways to ensure students' work was their own.
It began with educating teachers, parents and students about AI and being clear on when it can and can't be used.
All written tasks had to be submitted through Turnitin, software that detects plagiarism and AI-generated text. However, the tool is said to be only 98 per cent accurate.
"Within a school of 2000 students with multiple assessment tasks, that 2 per cent becomes quite a lot of students who might be false flags," Dr Burg said.
Students also have to complete a validation task in class, usually handwritten or oral, to ensure they have done the assessment themselves. It could be a reflection on their essay, a quiz, or a few critical-thinking questions answered verbally.
"That's probably the most controversial part of it," Dr Burg said.
"Unanimously, across the student and staff body, they all recognise that it was a great way to validate student learning. The only downside was the extra time in class it took to do the validations. But, over the last six months of doing that, we as a teaching body we've all learned and grown."
At Merici College, teachers have been thinking about the kinds of tasks and questions set for take-home assessments.
Deputy principal of teaching and learning Renee Taylor said tasks now often involved a student's personal experience, such as excursions to local places, which made it harder for AI bots to create a response.
Teachers have checkpoints where they look at a student's work before the final product is submitted.
If a teacher suspects a task was AI-generated, they run it through multiple checkers before having a conversation with the student - and usually it doesn't take long before the student confesses.
"Plagiarism is not something new for schools. In every school when there are students who are engaged in that behaviour it's usually always rooted in that a student is actually someone who needs support," Mrs Taylor said.
"Where we have found students have used AI to replicate their work, we have found that it actually comes from a place of struggle as opposed to wanting to deliberately cheat."
AI ethics
The ACT's Board of Senior Secondary Studies (BSSS) has been on the front foot in examining issues around academic integrity and ethics in the age of AI. The board has released special interest papers on the topic to guide colleges and had a member sit on the National AI Schools Taskforce.
In early December, education ministers approved the national framework for generative artificial intelligence in schools, giving the green light for AI to be allowed in classrooms from next year.
The ACT Education Directorate also has an AI working group that released an interim position and is working towards more guidelines for schools.
AI tools have been blocked from all student Chromebooks, but acting executive group manager for service design and delivery Angela Spence said students would be allowed to start using them at school from the end of Term 1, 2024.
"We absolutely see generative AI tools as a real enabler in so many different ways in education, for teachers and for students. But it is really important that we do this in a safe, responsible and ethical way," Ms Spence said.
Education Directorate staff have been allowed to use AI tools but with strict advice not to enter any personal information about students, including their names, ages, locations, assessment work or information on their learning capacity.
Ms Spence said teachers have been experimenting with AI chatbots to draft lesson plans and school leaders have even used them to write graduation speeches.
"We're certainly not putting personal information into AI tools. That's one of the safety measures that we have in place, but generic lesson planning, developing resources, absolutely they're experimenting and ... from a teacher's perspective, it's a real game changer in terms of workload."
Learning tools
With the basic ethical frameworks in place, schools will increasingly introduce AI into classrooms next year.
Dr Burg said teachers would be encouraged to demonstrate to their year 7 and 8 classes how AI can be used, while years 9 to 12 would use AI in structured tasks with "guardrail prompts".
"We're just starting off with years 10, 11 and 12, they're actually allowed to use AI for some parts of their assessment tasks from next year in ways that we think will benefit the students and also maintain academic integrity," Dr Burg said.
In a successful pilot in 2023, one computer class used AI to draft the content for a website so students could concentrate on coding it.
"We know we need to constantly evolve our practice. So we're constantly trying a little step, see how it goes and then either expanding it out to the school or adjusting as need be."
Mrs Taylor said AI could be used to modify work for students with disabilities, for instance by taking a year 11 task and asking for it to be simplified to a year 3 reading level.
ACT public schools will also start introducing AI tools to assist students with different abilities from the end of Term 1, 2024. Some of these tools can translate work into different languages, break down tasks into simpler language or provide instant feedback to students.
"We recently had an international speaker, come and work with us, the expert in transformational digital education, and he shared with us some tools and they popping up every day," Ms Spence said.
"They're all the things that we are exploring at the moment that we will release to our students gradually, once we've ensured that ... safety is in place. It's very exciting."