Apple gave Siri an upgrade as part of the Apple Intelligence rollout that lets it better understand context and follow-up queries, but the assistant still doesn't have the brain power or conversational capabilities of ChatGPT.
A recent report from Bloomberg suggests Apple is now hard at work on an "LLM Siri" that would allow the AI voice assistant to hold a natural conversation, much like Gemini Live, Meta AI's voice mode or ChatGPT's Advanced Voice Mode.
Apple already uses an on-device language model to power Apple Intelligence, but it isn't powerful enough to support fluid conversational AI like ChatGPT, which runs on a much larger, cloud-based model.
According to Bloomberg, LLM Siri would use an entirely new Apple AI model and be able to draw on App Intents more heavily. These let developers expose features within an app to Siri, so the assistant can act on them through natural-language requests, as in the sketch below.
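To make that concrete, here is a minimal sketch of what exposing an app feature via Apple's App Intents framework looks like today. The intent type and `perform()` method are part of the real framework; the "Add Note" action and the `NoteStore` class are hypothetical, invented purely for illustration.

```swift
import AppIntents

// Hypothetical in-memory store, included only so the example is self-contained.
final class NoteStore {
    static let shared = NoteStore()
    private(set) var notes: [String] = []
    func add(_ text: String) { notes.append(text) }
}

// An App Intent exposing an "add note" action that Siri (or Shortcuts) can invoke.
struct AddNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Note"
    static var description = IntentDescription("Creates a new note with the given text.")

    // The note text, filled in from the user's spoken or typed request.
    @Parameter(title: "Text")
    var text: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        NoteStore.shared.add(text)
        return .result(dialog: "Added your note.")
    }
}
```

An LLM-powered Siri that leans on these declarations more heavily could, in principle, chain actions like this one across apps from a single conversational request, rather than requiring a fixed phrase per shortcut.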
Why does Siri need an LLM brain?
I have previously written about how the ChatGPT integration in Siri largely renders the voice assistant redundant: anything creative simply gets handed off to ChatGPT.
Siri feels outdated, a relic of a different era. It is better with the recent updates, but it isn't close to Gemini Live, and that is a problem for Apple in its competition with Android.
Giving Siri a dedicated LLM brain and voice mode would likely require running models on Apple's own custom-built private cloud rather than on device, and building out that infrastructure for a global-scale voice assistant will take time.
The report by Mark Gurman suggests LLM Siri might initially launch as a separate app so Apple can gather user feedback. An announcement is likely at WWDC in June next year, with integration arriving as part of iOS 19 and macOS 16, though it may not actually go live until spring 2026.
In the meantime, Apple is looking to expand Siri's chatbot integration beyond ChatGPT by bringing in other partners such as Anthropic's Claude and Google's Gemini. This would act as a stopgap until the Siri LLM is ready.