
Known as Mobile World Congress before its rebrand to simply MWC, the trade show was a regular attraction for us when Windows Phone was all the rage.
These days, it's evolved into a much broader event that hosts prominent PC manufacturers showing off their latest mix of enterprise and commercial hardware alongside a handful of consumer devices.
At the center of it all, and my primary attraction to the exhibition, was a giant Intel booth where the company unveiled its commercial AI PC portfolio, powered by the Lunar Lake mobile processors I previewed in Taiwan last year and their Arrow Lake successors.
Intel is making AI accessible for all

It's easy to admire what Intel is achieving with its mobile chips, especially with Core Ultra Series 2 delivering what felt like more than a single generational uplift in efficiency and battery life over its Meteor Lake predecessors.
After a tour of the data center-focused Xeon 6 platform, I had candid chats with Intel's AI PC demo expert Craig Raymond, bouncing my thoughts on the category and its appeal to everyday consumers off him.
Raymond, clearly filled with the same passion for AI computing as I am, responded to my quest to find the "killer AI app" at MWC with a series of agentic AI demonstrations and indulged my appreciation for Intel's efforts with OpenVINO, a toolkit loaded with plugins and access to large language models (LLMs) running locally.
Here, we came to a mutual recognition that local inferencing, powered by dedicated neural processing units (NPUs) like those in Intel's chips, could be the breakthrough hit artificial intelligence needs to shake off a couple of common misconceptions among average consumers.
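To make that a little more tangible, here's a minimal sketch (assuming the openvino Python package and an Intel NPU driver are installed, which is my assumption rather than anything Intel demoed) that simply lists the local compute devices OpenVINO can target; on a Lunar Lake laptop, the NPU shows up right alongside the CPU and integrated GPU.

```python
# Minimal sketch: enumerate the local compute devices OpenVINO can target.
# Assumes the "openvino" Python package (pip install openvino) and, for the
# NPU to appear, an Intel Core Ultra system with its NPU driver installed.
import openvino as ov

core = ov.Core()
for device in core.available_devices:
    # Typically prints CPU, GPU, and NPU entries on a Core Ultra laptop.
    print(device, "-", core.get_property(device, "FULL_DEVICE_NAME"))
```

Anything in that list is fair game for local inference, and the NPU is the one that sips power while doing it.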
A broad misunderstanding of AI PCs

While I knew that my goal was to discover what the killer AI app might be to convince consumers that AI PCs are as revolutionary as our coverage suggests, I approached each brand at MWC with the same premise: I'm here to decipher the jargon and explain the appeal to consumers like my parents.
They aren't technophobes by any means, but they don't keep up with technology the way I do, leaving them at the mercy of advertising and a growing collection of acronyms.
After all, why should my parents know what TOPS are? Microsoft set a 40 TOPS minimum requirement for Copilot+ PCs, but I wouldn't expect them to learn about tera operations per second, even while they see Copilot advertisements in their daily travels.
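Still, the math behind the acronym is simple enough: multiply the number of multiply-accumulate (MAC) units by two operations each, then by the clock speed. The figures below are illustrative assumptions for a Lunar Lake-class NPU rather than official Intel specifications, but they show how a modern chip clears Microsoft's 40 TOPS bar.

```python
# Back-of-the-envelope TOPS arithmetic. The MAC count and clock speed are
# illustrative assumptions for a Lunar Lake-class NPU, not official specs.
mac_units = 12_288        # parallel multiply-accumulate units
ops_per_mac = 2           # each MAC counts as a multiply plus an add
clock_hz = 1.95e9         # roughly 1.95 GHz peak clock

tops = mac_units * ops_per_mac * clock_hz / 1e12
print(f"{tops:.1f} TOPS")  # ~47.9 TOPS, comfortably above the 40 TOPS floor
```

Not that any of that arithmetic matters to the average buyer, of course.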
In fact, my extended family still associates artificial intelligence almost entirely with generative AI like DALL-E for images, and is further intimidated by the alarming rise of deepfake images and voice-cloning tools unfortunately harnessed for telephone scams.
To them, "AI" is a negative buzzword and hardly anything they might link to the efficiency of their next laptop, and that's where I come in as someone who benefits first-hand from Intel's presence in AI.
AI PCs will quietly become the standard

As instrumental as AI PCs have been in driving what my editor-in-chief Daniel Rubino called a 'Great Reset' in the Windows PC industry, it's clear to me that the subcategory will eventually become insignificant.
It's not because AI PCs themselves lack significance; on the contrary, they represent the single most important shift in personal computing in years, if not decades.
Rather, the groundbreaking architecture inside will eventually feel less fantastical and more normal as mobile and desktop SoCs (systems on a chip) continue to include at least a CPU, an iGPU, and an NPU.
Will we need to talk about "AI PCs" in a few years to come? We don't think so, because in the future, it will be just a PC; that's a new definition of a PC.
David Feng, Vice President and General Manager, Client Segments, Client Computing Group at Intel
Local AI computing has been supported on CPUs and GPUs (both discrete and integrated) for years; only the recent prevalence of low-power, ultra-efficient NPUs has sparked significant intrigue, thanks to their ability to handle AI-centric tasks in the background while drawing milliwatts rather than watts.
As Intel iterates on its Lunar Lake NPU and AMD maintains focus on its Ryzen AI chips, the concept of processors lacking AI capabilities becomes a niche prospect — there aren't any logical reasons to backtrack on local inferencing if manufacturing costs aren't negatively affected.
In time, we'll likely stop referring to future laptops as "AI PCs" and just call them PCs. That's exactly how important artificial intelligence is to computing, and why it's crucial to navigate misconceptions around the term; AI is about much more than "thoughtful" conversational LLMs like ChatGPT.
The killer app will be the AI PC itself

My epiphany at MWC is that a single app likely won't be responsible for revolutionizing AI PCs, nor act as the catalyst that pushes them into the mainstream of standardized computing, where they become "just a PC."
However, allowing everyday users to harness local AI in a secure manner without sending a scrap of data to the cloud has the strongest chance of being a game changer, packaged as tens if not hundreds of different apps with varying styles.
Companion apps on PCs like myHP on the OmniBook Ultra Flip 14 and Vantage on Lenovo's Yoga Slim 9i will undoubtedly continue to expand their AI-powered dashboards with local document analysis, including instruction manuals for that same laptop — a clever way of having the device answer questions about itself.
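To illustrate the pattern rather than any OEM's actual implementation, here's a hedged sketch of a laptop answering a question from its own manual entirely on-device. It assumes the openvino-genai package and a chat model already exported to OpenVINO format; the manual excerpts and the naive keyword retrieval are placeholders for whatever HP or Lenovo actually ship.

```python
# Hypothetical sketch of a companion app answering questions from its own
# manual, entirely on-device. Assumes "pip install openvino-genai" and a chat
# model already exported to OpenVINO format in ./model; the keyword matching
# below is a deliberately naive stand-in for a real document search.
import openvino_genai as ov_genai

MANUAL_CHUNKS = [
    "Press Fn + Q to cycle between Quiet, Balanced, and Performance modes.",
    "The battery care setting caps charging at 80% to extend lifespan.",
    "Hold the power button for 10 seconds to force a shutdown.",
]

def best_chunk(question: str) -> str:
    # Pick the manual excerpt sharing the most words with the question.
    words = set(question.lower().split())
    return max(MANUAL_CHUNKS, key=lambda c: len(words & set(c.lower().split())))

pipe = ov_genai.LLMPipeline("./model", "NPU")  # "CPU" also works without an NPU driver
question = "How do I switch to Quiet mode?"
prompt = (
    f"Using this excerpt from the laptop manual:\n{best_chunk(question)}\n\n"
    f"Answer the question: {question}"
)
print(pipe.generate(prompt, max_new_tokens=100))
```

Nothing leaves the machine in that flow, which is exactly the privacy pitch these dashboards are leaning on.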
This idea of this 'from the heavens' killer app isn't the right expectation (...) it's going to build into this thing where at some point you're like 'oh, I don't ever want to be back on a non-AI PC.'
Cory McElroy, Vice President of Commercial Product Management at HP
There will be more examples that better harness the trio of AI processing units inside an SoC, but the majority likely won't come from OEMs like Dell, ASUS, and even Microsoft.
Instead, these giants seem more focused on utilizing their gargantuan enterprise presence to refine cloud-based AI into a package that translates from huge server farms to the commercial market and finally to consumer devices in an ultra-efficient manner.
Our killer app is going to be written by customers, and Dell is providing the capabilities for them to go write it.
Spencer Bull, Innovation Architect at Dell
Essentially, the top brands are doing all the hard work in artificial intelligence computing before it ever comes to an AI PC that my parents would likely buy in a brick-and-mortar electronics store.
Sure, that still leaves me hearing the question of "but what is my NPU actually doing?", and it's totally valid from anyone who watches Windows 11's Task Manager and sees little activity.

Right now, unless you regularly harness OpenVINO plugins in apps like Audacity to split music into separate stems or transcribe speech into exported text files like I do, you likely won't see that NPU pulse move much.
However, more developers will offload background tasks to the low-power chip as time goes on, which has the brilliant side effect of increasing laptop battery life and reducing processor temperatures.
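From a developer's point of view, that offloading is, at least with OpenVINO, closer to a one-line device choice than a rewrite. Here's a minimal sketch, assuming a model already converted to OpenVINO's format (the "model.xml" path is a placeholder of my own, not a real file):

```python
# Minimal sketch of offloading an existing model to the NPU with OpenVINO.
# Assumes a model already converted to OpenVINO format ("model.xml" is a
# placeholder path) and an Intel NPU driver; swap "NPU" for "CPU" or "GPU".
import numpy as np
import openvino as ov

core = ov.Core()
compiled = core.compile_model("model.xml", "NPU")

# Run one inference with dummy data shaped to the model's first input.
input_shape = compiled.input(0).shape
dummy = np.random.rand(*input_shape).astype(np.float32)
result = compiled(dummy)[compiled.output(0)]
print(result.shape)
```

The heavy lifting is in converting and tuning the model, not in pointing it at the NPU.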
It's just a matter of which software giant makes the biggest splash first, and you'll be hard-pressed to avoid the news when it happens, as any power-saving performance bumps are always touted in graphs and one-page press announcements.
For now: there is no killer AI app, and its implied importance is far lower than I had previously thought — AI will just continue to make our PCs better until we forget it's even there.