Reason
Joe Lancaster

U.K. Demands Access to Any Apple User's Data, Anywhere in the World

The United Kingdom's Home Office is reportedly demanding that Apple, one of the world's largest tech companies, provide law enforcement access to its users' private data, not just in Britain but around the world.

European regulators are no strangers to making demands that the rest of the world has to live with. In 2022, the European Parliament mandated USB-C as the common charging port for mobile devices sold in the E.U., prompting Apple to abandon its proprietary Lightning port worldwide.

But the U.K.'s demand for access to Apple user data poses a grave threat to global digital security.

"Security officials in the United Kingdom have demanded that Apple create a back door allowing them to retrieve all the content any Apple user worldwide has uploaded to the cloud," The Washington Post reported earlier this month. "The British government's undisclosed order, issued last month, requires blanket capability to view fully encrypted material, not merely assistance in cracking a specific account, and has no known precedent in major democracies."

Neither Apple nor the government has confirmed the order, though the Post notes that the request was made "under the sweeping U.K. Investigatory Powers Act of 2016, which authorizes law enforcement to compel assistance from companies when needed to collect evidence" and "makes it a criminal offense to reveal that the government has even made such a demand."

"These reported actions seriously threaten the privacy and security of both the American people and the U.S. government," Sen. Ron Wyden (D–Ore.) and Rep. Andy Biggs (R–Ariz.) wrote last week in a letter to newly minted Director of National Intelligence Tulsi Gabbard. "If Apple is forced to build a backdoor in its products, that backdoor will end up in Americans' phones, tablets, and computers, undermining the security of Americans' data, as well as of the countless federal, state and local government agencies that entrust sensitive data to Apple products."

Wyden and Biggs asked that Gabbard "giv[e] the U.K. an ultimatum: back down from this dangerous attack on U.S. cybersecurity, or face serious consequences." Wyden also released a draft of a bill designed to close loopholes in U.S. law that could allow foreign governments to make such demands of American companies.

Apple devices use AES 256-bit encryption, a widely deployed standard considered virtually unbreakable by brute force. Much of that data is also end-to-end encrypted, meaning it can't be accessed, even by Apple, without the user signing in on a trusted device.
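As a rough sketch of what "256-bit encryption" means in practice (this is an illustration with the third-party Python cryptography package, not Apple's actual protocol), a message sealed with a random 256-bit AES key can only be read by someone who holds that key:

```python
# Illustration only: not Apple's protocol, just standard AES-256-GCM
# using the third-party "cryptography" package (pip install cryptography).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 256-bit key: roughly 1.2e77 possible values
nonce = os.urandom(12)                     # unique nonce per message

ciphertext = AESGCM(key).encrypt(nonce, b"private note", None)

# Only a holder of the key can recover the plaintext; without it,
# brute force is computationally infeasible.
assert AESGCM(key).decrypt(nonce, ciphertext, None) == b"private note"
```

"End-to-end" simply means that only the user's own devices ever hold that key, so a server operator like Apple has nothing useful to hand over.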

That level of security is comforting to users but vexing to law enforcement: Last year, police in Detroit complained that a new software update caused inactive iPhones to periodically reboot, making it more difficult for cops to unlock and access phones collected as evidence.

The feds have also been particularly hostile to Apple's position of maximal user security. Multiple times in recent years, the FBI has demanded access to locked devices to aid an investigation; each time, Apple refused. In one such case, law enforcement recovered an iPhone used by one of the perpetrators of the 2015 mass shooting in San Bernardino, California. The phone allowed only a limited number of incorrect passcode attempts before locking out any further tries, so the bureau sought a court order requiring Apple to unlock it.

"Because Apple has no way to access the encrypted data on the seized iPhone, the FBI applied for an order requiring Apple to create a custom operating system that would disable key security features on the iPhone," the Electronic Privacy Information Center wrote. Apple CEO Tim Cook said the FBI was asking the company "to write a piece of software that we view as sort of the equivalent of cancer."

A federal judge issued the order, which Apple contested, but the FBI later withdrew the request and said it was able to retrieve the information another way.

Still, calls persist for tech companies to develop a "backdoor" for encryption that would allow access to law enforcement but no one else.

"Technology has become a tool of choice for some very dangerous people," then-FBI Director James Comey said in 2014. He elaborated in 2017 that privacy and security stem from the "bargain" that "there is no place in America outside of judicial reach," and "widespread default encryption…shatters the bargain."

In 2015 Senate testimony, Comey said someone should create a way for law enforcement to circumvent encryption without compromising user security. "A whole lot of good people have said: It's too hard; that we can't have any diminution in strong encryption to accomplish public safety, else it'll all fall down and there'll be a disaster. And maybe that's so. But my reaction to that is, I'm not sure that we've really tried."

Speaking at the 2018 Aspen Security Forum, then-FBI Director Christopher Wray echoed Comey's call for some sort of encryption master key: "There's a way to do this. We're a country that has unbelievable innovation. We put a man on the moon….And so the idea that we can't solve this problem as a society—I just don't buy it."

If that sounds like too simple an explanation, there's a reason.

"One of the fallacies I think of this debate is that it's often framed as, we've solved so many hard technical problems….Surely if we can put a man on the moon, we can design a secure backdoor encryption system," computer science professor Matt Blaze said in 2015. "Unfortunately, it's not so simple. When I hear, 'If we can put a man on the moon, we can do this,' I'm hearing an analogy almost as if we're saying, 'If we can put a man on the moon, surely we can put a man on the sun.'"

U.S. and U.K. officials want someone to build an unbreakable lock and then hand law enforcement a key. That would be alarming enough with a physical lock, but building a backdoor into digital encryption renders the encryption itself inherently vulnerable.

In effect, there is no way to introduce an intentional flaw into data encryption that cannot also be exploited by bad actors. A group of computer scientists studying government access proposals wrote in a 2015 article in the Journal of Cybersecurity that "such access will open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend."
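To make that point concrete, here is a deliberately simplified, hypothetical key-escrow sketch in Python (the names and design are invented for illustration; no real system is being described): every message key is also wrapped under a single "exceptional access" key, so whoever holds, or steals, that one key can read every message.

```python
# Hypothetical key-escrow sketch (illustration only, not a real design).
# Requires the third-party "cryptography" package.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

ESCROW_KEY = AESGCM.generate_key(bit_length=256)  # the single "golden key"

def send(message: bytes) -> dict:
    """Encrypt a message, but escrow its key under ESCROW_KEY (the backdoor)."""
    user_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    return {
        "nonce": nonce,
        "ciphertext": AESGCM(user_key).encrypt(nonce, message, None),
        # The backdoor: the per-message key, readable by the escrow key holder.
        "wrapped_key": AESGCM(ESCROW_KEY).encrypt(nonce, user_key, None),
    }

def exceptional_access(envelope: dict) -> bytes:
    """Anyone holding ESCROW_KEY (agency, insider, or thief) can read everything."""
    user_key = AESGCM(ESCROW_KEY).decrypt(envelope["nonce"], envelope["wrapped_key"], None)
    return AESGCM(user_key).decrypt(envelope["nonce"], envelope["ciphertext"], None)

envelope = send(b"strictly between us")
assert exceptional_access(envelope) == b"strictly between us"
```

The single escrow key becomes exactly the kind of high-value target those researchers warned about: compromise it once and every user's data is exposed.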

If the U.K.'s order stands, Apple will either have to weaken every user's security worldwide or cease operating in Britain altogether. Either move would be disastrous, whether for the company's bottom line or for its users' data privacy. And since some U.S. officials seem to share the U.K.'s motivation, it's clear that governments pose a much graver threat to privacy than corporations do.

