The Guardian - UK
Technology
Alex Hern

Does what happens on your iPhone still stay on your iPhone?

The iPhone 15 Pro is shown after its introduction on the Apple campus, September 2023. Photograph: Jeff Chiu/AP

AI is power-hungry, and that’s causing problems for Apple.

We’re still working through the ramifications of the company’s worldwide developers conference, where it revealed how it intends to incorporate AI into your daily life – but only, for the most part, if your daily life involves a brand new iPhone:

Apple’s new AI models will run on the iPhone 15 Pro and Pro Max, the only two devices the company has yet shipped with its A17 processor. Macs up to three years old will also be able to take advantage of the upgrade, provided they have an M1, M2 or M3 chip, and so too will iPad Pros with the same internal hardware.

The cheaper iPhone 15 models run the A16 Bionic, a chip that debuted in 2022. They also have 6GB of memory, compared with the 8GB included on their more expensive Pro siblings, which may be the pertinent difference, since the M1 chips which can run AI models on the Mac are equivalent to 2020’s A14 iPhone processors.

That’s a lot of model numbers to hammer home the point that AI features won’t run on just any old phone. But many of the most advanced AI models won’t run on any phone – or at least, not at a speed users would find acceptable. If Apple wants to offer AI tech, it has to do it with a datacentre. And that poses difficulties. Kari Paul writes:

At the core of Apple’s privacy assurances regarding AI is its new Private Cloud Compute technology. Apple seeks to do most of the processing required to run Apple Intelligence features on the device itself. But for functions that require more processing than the device can handle, the company will outsource processing to the cloud while “protecting user data”, Apple executives said on Monday.

To accomplish this, Apple will only export the data required to fulfil each request, create additional security measures around the data at each end point, and not store data indefinitely. Apple will also publish all tools and software related to the private cloud for third-party verification, executives said.

You can’t offer total privacy for AI queries, the way you can for online backups or messaging services, because the whole point is that the server at the other end needs to know what is being asked in order to give the right answer. But that’s a problem for Apple, which has spent years arguing that a crucial distinction between it and corporate rivals like Facebook and Google is that, to quote one multimillion-dollar ad campaign, “What happens on your iPhone stays on your iPhone.”

The solution is impressively wrought. Apple will run its own dedicated datacentres, on hardware it designed, set up never to retain any user data. Apple will release the software running on those servers to security researchers, who will be able to load it up themselves and verify it does what the company says, and who will be given the tools required to check that the software running in the datacentres is identical.

But the question is: does that mean you don’t have to trust Apple? The last time I covered a company going to such lengths to bind itself was Huawei, which launched a “cybersecurity evaluation centre” and partnered with GCHQ for almost a decade to try to clear itself of supposed ties to the Chinese state.

It didn’t work. Huawei was unable to offer the evidence needed to clear its name – and probably never could. If you don’t trust someone, you shouldn’t run their software, and there’s almost nothing they can say to persuade you otherwise. (Even publishing the source code isn’t all that much help.)

Apple isn’t Huawei, and for many, the company has earned the trust it’s now seeking to spend. But try as it might, Apple can’t get away from the fact that the rise of AI has forced it to compromise on one of the foundational principles of the iPhone era.

“What happens on your iPhone stays on your iPhone, unless you use some Apple Intelligence features, in which case it may leave your iPhone to go to a server controlled by Apple, which is running software that means that it stays on the server” might be almost as privacy-centric, but it definitely doesn’t fit on a billboard.

Am I ready to switch from smartphone to Light Phone?

I’m perennially fascinated by the products at the fringes of the smartphone world, existing in the few niches that aren’t smothered by Apple and Google. At one end, that covers the AI devices from the likes of Humane and Rabbit – hardware that hasn’t quite lived up to its lofty ambition, suggesting that the market is open because it’s not possible to satisfy it yet.

At the other end is a growing category of product you might call the anti-phone: devices built for people who don’t want a full digital detox, but don’t want to carry around a £1,000 distraction box either. Devices like the Light Phone III:

The Light Phone III is built around a user-customisable menu of optional tools. All of the tools are custom-designed for our LightOS to ensure a thoughtful experience.

Available tools currently include: Alarm, Calculator, Calendar, Directory, Directions, Hotspot, Music, Notes/Voice Memo, Podcast or Timer

It boasts large ergonomic metal buttons, including a dedicated two-step camera shutter: half-press to focus, full press to snap a photo.

It’s a fascinating cross-section of features, attempting to square the circle between people’s professed desire to be distraction-free, and their practical need to access the conveniences of the digital world.

Some of the limitations are obvious and deliberate – no web browser, for instance, means that even if you do crack and crave a hit of social media, you can’t just log in to the Instagram website.

Others, though, speak to the difficulty of trying to play in this space as an independent company. The “music” app necessarily plays only local files, since it can’t access streaming services like Spotify and Apple Music without support from the developers. The phone is cut off from encrypted messaging services like Signal and WhatsApp for the same reason.

Every time I consider making the switch to a device like the Light Phone, I tell myself that the demands of my job and family life mean it would be irresponsible to cut myself off like that. Is that just an excuse, though? Do I even want an anti-phone, or do I just want fewer demands on my life?

The wider TechScape
