
Companies that sell artificial intelligence technologies to the federal government should prepare for tougher scrutiny of their products.
That’s according to Taka Ariga, the chief data scientist of the U.S. Government Accountability Office (GAO), the 100-year-old agency that serves as a watchdog for Congress. Among the agency's tasks is investigating government processes, like technology purchasing, for irregularities or outright corruption.
The GAO’s latest report on the government's use of facial recognition found that several agencies have failed to properly monitor the use of the technology by their employees and contractors. That matters because privacy advocates worry about the government using facial recognition unnecessarily, potentially violating people’s privacy.
The GAO’s report was based on self-reported data and therefore wasn't comprehensive, as BuzzFeed News noted. The U.S. Probation Office, for instance, which supervises people charged with or convicted of federal crimes, told the GAO that it hadn't used facial recognition software sold by the startup Clearview AI during a certain time period. But that contradicted internal Clearview data that BuzzFeed News had previously seen.
Future GAO reports on facial recognition tech and related A.I. software will likely be deeper, Ariga told Fortune. Amid the increased use of machine learning, the agency has developed new methodologies for the “audits of tomorrow,” Ariga said.
They'll take into account how machine learning software must be fed enormous quantities of data to perform correctly. For audits, the GAO will require government vendors to disclose the data they used to train their software and to reveal more about how it makes decisions. Because A.I. software is so new, its makers operate under few rules. The GAO hopes to change that by creating standards for how companies report on how their technology works and by encouraging Congress to apply pressure to companies that fail to measure up.
Cloud computing vendors including Amazon and Microsoft generally keep the inner workings of their A.I. tools secret for competitive reasons. But new GAO auditing requirements may compel them to disclose more, Ariga explained.
“People can’t hide behind their IP,” Ariga said, using shorthand for intellectual property. “That won’t work for the federal government.”
Ariga said the GAO has met with unspecified tech vendors to discuss what could happen during an A.I. audit. He said there was “a lot of nodding heads,” implying that company representatives understand the GAO's new auditing guidelines and have yet to push back. Several representatives, he added, told him “this will be an interesting few years.”
“Frankly, we don’t know how the industry will react,” Ariga said. But he added that vendors like Clearview AI, which gained notoriety for creating a massive database of people's faces scraped from the Internet, “absolutely should” expect tougher reviews in the near future.
“We don’t want to be playing catch up,” he said.
P.S. Fortune wants to hear from you about who should be on this year's 40 under 40 list. Check out past lists here to get a feel of who we are looking for. Please do not email us about the list, but rather fill out the form below.
40 Under 40 nominees
Jonathan Vanian
@JonathanVanian
jonathan.vanian@fortune.com