On May 20, 2024, I watched live as Microsoft's Pavan Davuluri announced one of the most controversial features in the company's history, and possibly in tech history at large.
Davuluri, Microsoft's Corporate Vice President of Windows + Devices, kicked things off by laying out Microsoft's vision for the Copilot+ experience:
"Windows has always believed in making technology accessible to everyone. Today, we’re carrying that belief into the new era of AI with a reimagined core architecture for the PC, weaving AI into every layer, from the chip to Windows to the cloud."
When Davuluri said Microsoft was "weaving AI into every layer" of your laptop, he wasn't kidding. The feature he announced just seconds later proved it: Recall.
Microsoft Recall uses AI to track and interpret everything you do on your Windows 11 computer. At regular intervals, it takes a screenshot of whatever is on your display and saves it, so you can later recall what you were doing, searching for, or working on.
For example, you could ask Copilot to find a pair of shoes you were shopping for a month ago or dig up an essay draft you lost in your files last week.
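To make the mechanics concrete, here's a minimal, hypothetical sketch in Python of the core idea: capture what's on screen at a fixed interval, store the extracted text locally, and search that history later. This is in no way Microsoft's implementation; the capture_screen_text() helper, the database name, and the one-second interval are all placeholders for illustration.

```python
# Hypothetical sketch of the Recall concept, not Microsoft's code:
# periodically snapshot the screen, store extracted text locally, search later.
import sqlite3
import time
from datetime import datetime, timezone

DB_PATH = "recall_sketch.db"  # placeholder local store


def capture_screen_text() -> str:
    """Stand-in for a real screenshot + OCR step (e.g., mss + pytesseract)."""
    return "example: browsing a pair of running shoes on a shopping site"


def init_db(path: str = DB_PATH) -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE IF NOT EXISTS snapshots (taken_at TEXT, screen_text TEXT)")
    return conn


def take_snapshot(conn: sqlite3.Connection) -> None:
    conn.execute(
        "INSERT INTO snapshots VALUES (?, ?)",
        (datetime.now(timezone.utc).isoformat(), capture_screen_text()),
    )
    conn.commit()


def search_history(conn: sqlite3.Connection, query: str) -> list:
    # Naive substring search; the real feature layers OCR and semantic search on top.
    return conn.execute(
        "SELECT taken_at, screen_text FROM snapshots WHERE screen_text LIKE ?",
        (f"%{query}%",),
    ).fetchall()


if __name__ == "__main__":
    conn = init_db()
    for _ in range(3):      # a real background agent would run indefinitely
        take_snapshot(conn)
        time.sleep(1)       # placeholder interval, not Recall's actual cadence
    for taken_at, text in search_history(conn, "shoes"):
        print(taken_at, "->", text)
```

The real feature adds image snapshots, OCR, and semantic search on top of that loop, but the core trade-off is the same: a continuously growing local record of everything you've seen on screen.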
That might sound useful, but I'll admit that as I watched Davuluri demonstrate Recall that day, I was more concerned than impressed. Microsoft quickly discovered that I wasn't the only one concerned, perhaps even scared.
Having an AI watch everything you do, all the time, just in case you need help finding something later, requires a lot of trust in a big tech company. As it turns out, many people don't extend that trust to Microsoft, and the result was an explosion of controversy over Recall.
Over the months following that original announcement, Microsoft learned the hard way that, in this early era of AI for all, privacy matters far more than convenience. The controversy, as you'll read below, might push the companies building future AI products to be more thoughtful about how they use your data.
Microsoft Recall is watching, even when you tell it not to
Early in his Recall demo, Davuluri explained that users can control what Recall sees by turning it off or blocking it from screenshotting specific websites. He noted, "Even the AI running on your device can’t access private content."
Unfortunately, this assurance was not enough to quiet fears about the privacy and security risks Recall posed. When the feature was announced on May 20, the day before Microsoft Build 2024 kicked off, Microsoft planned to launch Recall as a preview feature a month later, on June 18, 2024. However, the backlash over privacy concerns was so intense that Microsoft paused that release... a long pause.
People were quick to notice serious vulnerabilities in Recall. On May 31, just days after the feature's announcement, Kevin Beaumont, a cybersecurity expert and former Senior Threat Intelligence Analyst at Microsoft, published a post on Medium explaining the slew of security risks he found in Recall. The article's title says it all: "Stealing everything you’ve ever typed or viewed on your own Windows PC is now possible with two lines of code — inside the Copilot+ Recall disaster."
Beaumont was one of many cybersecurity experts calling out what he saw as significant flaws in Recall's design, and users were right to be concerned. As Beaumont revealed, the original version of Recall stored its data in plain text, unencrypted, with little to no protection from unauthorized access. There was also the very real risk of the AI capturing sensitive information, such as private conversations, passwords, or financial data.
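To see why that matters, consider the toy database from the earlier sketch (not Recall's actual store). If screen history sits on disk unencrypted, any program running under your user account can dump it in a few lines, which is essentially the point Beaumont's "two lines of code" headline was making.

```python
# Reads the toy database from the earlier sketch, not Recall's real store.
# With no encryption, nothing but filesystem permissions protects the data
# from any process already running as the logged-in user.
import sqlite3

rows = sqlite3.connect("recall_sketch.db").execute(
    "SELECT taken_at, screen_text FROM snapshots"
).fetchall()
print(f"Exfiltrated {len(rows)} snapshots of screen history")
```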
Just a couple of weeks after Recall's announcement, on June 7, 2024, Microsoft finally responded to the outcry. It made Recall opt-in, with a clear yes-or-no choice during the setup process for Windows 11 PCs, and required Windows Hello to use Recall for an added layer of security.
Microsoft delayed the preview release of Recall twice, spending the rest of the year on updates meant to address the plethora of security issues. Finally, on December 6, Recall launched as a preview feature available only to members of the Windows Insider Program.
Unfortunately for Microsoft, even after months of reworking the feature, new security issues were spotted within days of the preview launch. Tom's Hardware caught Recall capturing credit card numbers and Social Security numbers even with the "sensitive information" filter turned on.
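It's not hard to see how a filter like that can fall short. Here's a deliberately naive, hypothetical example (again, not Microsoft's code) of a pattern-based sensitive-data filter: it catches a card number or Social Security number in its tidiest formatting and misses the same digits rendered slightly differently, which is exactly the kind of gap screen-level filtering has to contend with.

```python
# A deliberately naive, hypothetical "sensitive info" filter, to illustrate why
# pattern matching on screen text is brittle. This is not Microsoft's filter.
import re

SENSITIVE = re.compile(
    r"\b\d{4}[ -]\d{4}[ -]\d{4}[ -]\d{4}\b"  # card number in neat groups of four
    r"|\b\d{3}-\d{2}-\d{4}\b"                # SSN with dashes
)

samples = [
    "Card: 4111 1111 1111 1111",        # caught
    "SSN: 078-05-1120",                 # caught
    "Card on file: 4111111111111111",   # missed: no separators
    "SSN 078 05 1120 confirmed",        # missed: spaces instead of dashes
]

for text in samples:
    status = "FLAGGED" if SENSITIVE.search(text) else "missed"
    print(f"{status:8}| {text}")
```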
Apparently, when AI is woven into "every layer" of the Windows experience, it's hard to unweave it when you want your privacy back. The rampant security concerns with Microsoft Recall carry a critical lesson for tech companies and AI developers: for most people, convenience is not worth sacrificing privacy.
The Recall recall and the importance of privacy in the age of AI
I've noticed a recurring theme this year: AI can do many cool things, but it does so at the cost of enormous amounts of data, often your data.
That reality still creates friction across many audiences, from celebrities campaigning against deepfakes to everyday users worried about the privacy implications of new Windows features.
AI is finding its way into our tech, for better or worse. There are some incredible ways AI can be used for good, even to save lives. However, the need for massive amounts of data to power and create AI models also creates very real possibilities for this tech to be used for harm, even to ruin lives.
We need to reckon with that for AI to be part of a future where technology is used to build a better world. The Microsoft Recall debacle should serve as a warning for other tech companies. Over the past few decades, tech has been used to make our lives more convenient, but Recall proved there's a limit to that.
The reality, right now anyway, is that most people don't want tech to do everything for them. The ability to have an AI remember where you store your files is not worth letting that same AI track everything you do on your laptop. It's certainly not worth the risk of a hacker snatching your Social Security number from a screenshot with just a couple of lines of code.
In the age of AI, at least one thing matters far more than convenience: privacy, and respecting users enough to put them in control of it.
Whether you're a dedicated AI enthusiast, an average AI tinkerer, or simply after some of the additional features offered through Windows Copilot+ PCs or Apple Intelligence on Macs, you'll need a powerful, performant laptop to keep up with your needs.
At Laptop Mag, we review laptops year-round to ensure we're giving you expert-backed and up-to-date recommendations on which notebook is right for you. When it comes to the best AI PC category, our top picks are the excellent Asus Zenbook S 14 (UX5406) for Windows users and the impressive Apple MacBook Air M3 for those running macOS.
So, if you're shopping for a new laptop and looking to invest in an AI PC (or just a great laptop in general), check out our current top-tier picks below.
Best Mac for AI
We love the MacBook Air 13 M3. Starting at $1,099 (MSRP), with education pricing dropping to $999, the Air is affordable, especially by Apple standards, and it features an excellent keyboard, fantastic performance, and outstanding endurance (over 15 hours of battery life). That makes it a great laptop for just about anyone, especially those interested in getting to grips with all of the latest Apple Intelligence features.
Best Windows AI PC
The Asus Zenbook S 14 (UX5406) has quickly become our favorite AI PC of the year, offering all the hallmarks of a great buy, including exceptional performance and battery life. It's one of the first laptops to feature an Intel Core Ultra 200V series processor, and at $1,499 (MSRP) you get a fantastic balance of power, a stunning 14-inch OLED display, effortless multitasking, NPU-enhanced performance for AI tasks, and all of the additional Copilot+ features available with Windows 11.