Crikey
Cam Wilson

‘World’s most controversial company’ Clearview AI still being used to solve Australian police cases

Clearview AI is being used to solve Australian police cases despite the privacy watchdog slamming police for their use of the controversial and unlawful facial recognition technology.

After publicly cutting ties with the company and denying that any third parties are using the technology on its behalf, the Australian Federal Police (AFP) has now confirmed to Crikey that it provided case material to an international law enforcement agency, and that the material was later analysed using Clearview AI’s technology.

Internal correspondence obtained by Crikey shows that police monitored the issue, discussed the risk to their agency if the use of Clearview AI’s technology gained the attention of the Australian media, and planned how to spin its use in a positive light.

Australian police’s relationship with the ‘world’s most controversial company’

Clearview AI is a US tech company co-founded by Australian Hoan Ton-That. It was dubbed the “world’s most controversial company” after a 2020 New York Times article revealed that it had built a facial recognition app based on a database of billions of illegally obtained images scraped from the internet. 

The company claimed that anyone with a smartphone could use a photograph to instantly find out someone’s identity, as well as other information about them. Hundreds of law enforcement agencies — including the AFP and the Queensland Police Service (QPS) — were given access to the software, as were private customers.

Clearview AI and its users soon earned the attention of regulators around the world, and the company faced penalties or civil lawsuits in many jurisdictions. In Australia, information and privacy commissioner Angelene Falk found in 2021 that Clearview AI had broken the law through acts including the unauthorised collection of Australians’ data. She delivered a separate finding on the AFP’s use of the technology, criticising the agency for breaching Australians’ privacy.

While the AFP no longer directly uses Clearview AI, Crikey can now reveal that the AFP’s international law enforcement counterparts have used the company’s facial recognition technology on the case material that the Australian agency provided to Interpol on at least one occasion last year. 

Operation Renewed Hope

Between July 17 and August 4 last year, the United States’ Homeland Security Investigations (HSI) ran a global victim identification task force, Operation Renewed Hope, charged with reviewing unsolved cases of child sexual exploitation. The task force brought together staff from HSI and other US agencies, intergovernmental police bodies like Interpol, and representatives from more than a dozen other countries’ law enforcement agencies, including the AFP and the QPS.

A briefing by QPS Assistant Commissioner Katherine Innes to the state’s police minister in June, obtained through a Queensland right to information request, shows that agencies were informed ahead of time that the operation would use “new and emerging technologies” on old cases. A June executive briefing prepared by a detective assistant sergeant in the AFP’s victim identification team, obtained through a federal freedom of information (FOI) request, similarly states that participation in the operation would provide the AFP an “invaluable opportunity for information sharing” about combating child sexual exploitation.

Three days after the task force concluded, Forbes first reported on the operation’s existence — and its use of Clearview AI’s facial recognition product Clearview ID: “Sources told Forbes that Clearview and other AI tools were used to scan huge caches of child exploitation material captured by HSI as well as Interpol’s International Child Sexual Exploitation (ICSE) database, which contains more than 4.3 million images and videos of abuse,” wrote reporter Thomas Brewster.

Operation Renewed Hope resulted in 311 referrals for victims identified from the ICSE images. These included one Australian victim who, according to an ABC interview with QPS victim identification analyst and Operation Renewed Hope attendee Scott Anderson, had previously made disclosures to police.

The AFP has contributed case material to the ICSE database in the past. In late 2022, an AFP and Interpol joint announcement about the Australian agency providing $815,000 in funding for the ICSE stated that 860 victims had been identified and 349 offenders had been arrested in Australia as a result of the collaboration. The press release said that the funding would go towards improving the database “through integration of the latest technologies … [including] facial recognition”. 

An AFP spokesperson told Crikey that the force does “not use Clearview AI or use any other agency to do so on our behalf” and that AFP staff who attend task forces only use approved tools. This echoes a statement made by Deputy Commissioner International and Specialist Capabilities Command Lesa Gale during Senate estimates in May last year. 

But the spokesperson also confirmed that the AFP provides material to Interpol’s ICSE database, which formed the basis of the material analysed in Operation Renewed Hope. That analysis resulted in further developments in at least one Australian child sexual exploitation case. In short, the AFP shared Australian case material with a third party, and that material was analysed using Clearview AI technology the agency itself wasn’t allowed to use.

When asked when the AFP became aware that Operation Renewed Hope — which it was a part of — was using Clearview AI, the spokesperson didn’t answer directly, instead giving Crikey a general statement that also sidestepped our question about whether the agency believes it should have access to Clearview AI’s technology.

“The AFP is committed to using any online tools appropriately, and to carefully balance the privacy and potential sensitivity of data in an online environment with the important role this information can play in [the] investigation of serious crimes, including terrorism and child exploitation,” they said. 

‘Given the use of Clearview ID… there are risks’

Behind the scenes, emails show, Australian police carefully watched the reaction to the use of Clearview AI in Operation Renewed Hope. On August 9, an Australian Centre to Counter Child Exploitation (ACCCE) detective sergeant emailed one of the centre’s leadership team, a detective superintendent, about the Forbes article with the subject line “Article re HSI taskforce and use of facial recognition tools on ICSE data”. “This article does not reference any international involvement, however does reference the use of facial recognition tools, and that the material was provided from the Interpol ICSE database,” they wrote. “[Redacted] brought it to me this morning, noting it would be relevant to brief up so that we can prepare responses should we be asked for comment.” 

The detective superintendent thanked them and said they would bring it up with their superior: “We can step through any questions he might have,” they replied.

Around the same time, QPS was also considering its strategy. In an August 10 email addressed to “Chief”, a staff member sent a briefing note in response to a request from 7 News for an interview about the operation: “Given the use of Clearview ID by the HSI, there are risks in us doing an interview.” The briefing note mentions the privacy commissioner’s Clearview AI decision. In a section titled “Media Issues”, it states that QPS staff will not mention the use of Clearview AI’s technology and that the agency’s media team is “assisting with appropriate messaging around this technology to ensure any comments are brand agnostic and that the mention is made of the potential privacy and human rights implications that the technology can bring.”

Since Operation Renewed Hope, Australian police have stepped up their advocacy for the use of Clearview AI. A September article in The Australian reported comments by AFP Assistant Commissioner Hilda Sirec and Deputy Commissioner Lesa Gale — who had previously denied in Senate estimates that the AFP used Clearview AI through a third party — in favour of granting access to tools like Clearview AI, based on the success of Operation Renewed Hope.

It also featured an enthusiastic endorsement from former AFP ACCCE Operations Manager Jon Rouse, who appeared in the Herald Sun, 7 News and ABC coverage of Operation Renewed Hope and argued that an effective ban on Clearview AI meant police were “working with one arm tied behind their backs”.

“The debate over protecting privacy and whether we should be allowed to scrape data is irrelevant to me,” Rouse told the Herald Sun.

(Last September, Crikey reported that Rouse, while still working for the AFP, had met with Ton-That, emailed back and forth with him and arranged for him to brief a meeting of Australian and New Zealand child protection unit leaders to get “some education on the Clearview Issue”, all after the privacy commissioner’s adverse findings against Clearview AI and the AFP.)

Australian Greens digital rights spokesperson Senator David Shoebridge said he has been concerned for some time about Australian police “essentially contracting out their use of Clearview AI”.

“What is especially troubling is the suggestion that victims’ images have been uploaded to the Clearview database without consent and without any effective privacy checks,” he said in an emailed statement. “Analysis of Clearview has shown that it has a very real racial bias; if for no other reason, this should prevent the AFP from participating in it.”

Shoebridge said this shows that Australia’s privacy legislation is inadequate.

“If the federal police can routinely flout privacy protections, even when they have been called out by the privacy watchdog, then surely that’s proof of the need for urgent reforms and far clearer protections,” he said.

The Attorney-General’s office did not answer questions about whether the AFP’s use, via a third party, of a technology found to have breached Australia’s privacy laws was meeting its legal obligations and community expectations.

University of Melbourne senior lecturer Dr Jake Goldenfein, who has researched facial recognition technology and surveillance, described the use of Clearview AI on AFP material by a third party as an example of law enforcement “arbitraging jurisdictions”.

He said it’s remarkable that Clearview AI has emerged as the most controversial and objectionable facial recognition company, yet law enforcement agencies around the world still want to use its product.

“There’s this big regulatory effort against Clearview AI,” he said over the phone. “The administrative state is ostensibly against this company, but it’s not going to disappear because another part of the state is using it.”

UNSW criminology professor and child sexual exploitation expert Dr Michael Salter said it’s understandable that police would want access to technology like Clearview AI to investigate the most serious crimes, which are often aided by new technologies.

“The tech industry is delivering powerful tools that can be used to abuse kids but there isn’t the same commercial imperative to create tools that can protect them,” he said on a call. 

“Like before, we’re seeing the same dynamics where we don’t get the tools to catch criminals or give relief to the victims, while criminals can use it for their own means.”
