The Guardian - AU
Technology
Josh Taylor

Cybersecurity funds should go towards beefing up Centrelink voice authentication, Greens say

The federal government should use some of the $10bn allocated in the budget for cybersecurity defences to combat people using AI to bypass biometric security measures, including voice authentication, a Greens senator has said.

On Friday Guardian Australia reported that Centrelink’s voice authentication system can be tricked using a free online AI cloning service and just four minutes of audio of the user’s voice.

After the Guardian Australia journalist Nick Evershed cloned his own voice, he was able to access his account using his cloned voice and his customer reference number.

The voiceprint service, provided by the Microsoft-owned voice software company Nuance, was being used by 3.8 million Centrelink clients at the end of February, and more than 7.1 million people had verified their voice using the same system with the Australian Taxation Office.

Despite being alerted to the vulnerability last week, Services Australia has not indicated it will change its use of voice ID, saying the technology is a “highly secure authentication method” and the agency “continually scans for potential threats and makes ongoing enhancements to ensure customer security”.

The Greens senator David Shoebridge said the finding was “deeply troubling” for people who rely on government services and there needed to be a regulatory framework for the collection and use of biometric data.

“The concerns here go beyond the use of AI to trick voiceprint,” he said. “There are few, if any, protections on the collection or use of our biometric data to feed and train corporate AI systems.

“We can’t rely on a whack-a-mole approach to digital security where issues are only dealt with once they embarrass the federal government.”

Shoebridge said the $10bn in funding in last year’s budget for the Australian Signals Directorate’s Redspice cyber defence program should include an investment to ensure that threats such as the misuse of AI could be identified and protected against on a whole-of-government basis.

Guardian Australia has sought comment from the deputy prime minister and defence minister, Richard Marles, who is responsible for the ASD.

Shoebridge said the government should also audit agencies using voice recognition to ensure any further security flaws were identified and fixed.

“The government’s main objective with the use of such technologies is to cut operating costs as opposed to what is best for the millions of Australians who rely on government agencies and services,” he said. “These government savings are almost always paid for by Centrelink’s clients.”

In the 2021-22 financial year, Services Australia reported using voice biometrics to authenticate 56,000 calls each workday, accounting for more than 39% of all calls to Centrelink’s main business lines.

Between August 2021 and June 2022 it was used in 11.4% of all child support calls, or more than 450 each working day.
