![](https://www.newcastleherald.com.au/images/transform/v1/crop/frm/e5Qc2M5qQnfX3PTaVNk9Vy/401c0a14-b682-49e6-8875-b6cd4cd361bc.jpg/r99_52_1017_678_w1200_h677_fmax.jpg)
Have you heard of ChatGPT? It has been hard to avoid it in recent days.
The artificial intelligence (AI) chatbot has been hailed as a technological breakthrough for its highly cohesive, human-like responses to questions and prompts.
It was only launched in November 2022, but it has already become a household name.
While the world has been talking for years about natural language processing and how it can be used to create content, ChatGPT is the first such tool that feels truly practical.
And as humans, we're now testing its capacity and feeling our way around what it means.
The AI can write an essay or a novel based on a short prompt.
It can be used to formulate responses for customer service centres, create marketing material and even write press releases.
It could write this editorial if we wanted it to, and it definitely could write news stories.
Some argue this is the end of writing as we know it.
And the fear isn't unfounded: universities have banned students from using it, and some have even resorted to pen-and-paper exams.
Last month, it was revealed that students in NSW state schools would not be able to access artificial intelligence applications like ChatGPT while at school amid concerns it would help students cheat on assessments.
NSW is the first Australian state or territory to restrict access to the application on student devices, or while students are using their personal devices on the school network.
There's no doubt ChatGPT is a disrupter on the same scale as steam power, mass production and the digital revolution.
Over the coming months, we'll see how businesses use the tool for growth and build it into products.
News organisations worldwide already use some form of AI to write content, including the Associated Press, which uses it for corporate earnings stories.
But there is one practical problem with ChatGPT: its knowledge is static.
The AI doesn't access new information in real time, which means that right now it's stuck in November 2022, and from time to time it will "make up facts" in a phenomenon known as "hallucinating".
Of course, developers will likely find a solution to this, but right now - and probably always - there will be a place for quality control, credibility and fact-checking.
-
EDITOR'S NOTE: This was not written by ChatGPT.
ISSUE: 39,820