- Apple is redesigning Siri by leveraging Apple Intelligence and models like Gemini, with a phased rollout across iOS 26.4, 26.5, and 27.
- The first key improvements are personal context, screen awareness, and cross-app actions via App Intents.
- The roadmap culminates in a chatbot-like Siri capable of long conversations, content generation, and deep system control.
- Strict privacy and limited compatibility with recent devices will determine which users can take full advantage of the new Siri.

If you've ever thought that Siri has fallen behind ChatGPT or Gemini, you're not alone. For years, Apple's assistant has been very useful for quick tasks but has fallen short when asked for something more complex, especially compared with the latest generation of chatbots.
We are now at a turning point: Apple is reinventing Siri by relying on Apple Intelligence and on models like Google Gemini, with a roadmap unfolding across several iOS versions (26.4, 26.5, and 27) that also reaches iPadOS, macOS, and CarPlay. This is no small change: it touches privacy, internal architecture, how the iPhone is used, and even Apple's relationship with other AI giants.
What is Siri today and why is Apple redesigning it?
Siri was born as a virtual assistant exclusive to the Apple ecosystem. Available on iOS, macOS, watchOS, and tvOS, it uses natural language processing to understand your requests, query web services, and provide a response or perform an action: set an alarm, send a message, create a note, or check the weather.
Siri began as a project of SRI International (via its venture arm), spun out of a DARPA research program called CALO, considered one of the most ambitious artificial intelligence projects of its time. In 2010, Apple acquired the company behind Siri, canceled its plans for Android and BlackBerry, and integrated the assistant natively into the iPhone 4S in 2011.
Since then, Siri has expanded to more languages, more countries, and more platforms. It arrived on the iPad, Apple Watch, Apple TV, and Mac (with macOS Sierra in 2016). With iOS 6, it improved integration with apps like Maps and Reminders, added Bluetooth compatibility for cars, and introduced features like Eyes Free, designed to let you use Siri from the steering-wheel controls.
The assistant relies on an ecosystem of external services that feed its responses: OpenTable or Yelp for restaurants; MovieTickets or Rotten Tomatoes for movies; True Knowledge, Bing Answers, or Wolfram Alpha for reference data; and Bing and Google for the web. All of this lets it answer questions, make reservations, or pull in contextual information.
On a practical level, Siri has always been a very useful assistant for specific tasks: checking the weather, finding a nearby restaurant, scheduling appointments, dictating messages, searching the web, or opening apps. It even has a playful side, with jokes, raps, stories, and quirky answers that have become popular among users.

But against competitors like Google Assistant, Amazon Alexa, or Samsung Bixby, and more recently the rise of large language models like ChatGPT, Gemini, and Claude, Siri has shown clear limitations: a relatively closed system, partial integration with third-party apps, and a sense of stagnation on the conversational side.
Privacy and design: the most discreet assistant in the ecosystem
One of the pillars Apple continually repeats is that Siri is designed to be a very personal but extremely discreet assistant. The system learns your routines and preferences, but in a way that doesn't tie what you ask for to your Apple account as a direct identifier.
Thanks to the Neural Engine in Apple's chips, much of the speech and language processing happens directly on the device (iPhone, iPad, or Apple Watch). The audio of your requests does not leave the device unless you explicitly give permission to share it, and Apple emphasizes that conversations with Siri are not used for advertising.
On this foundation, Apple builds its vision of Apple Intelligence and Private Cloud Compute: models run on the device itself for ordinary requests and, when more power is needed, rely on servers designed with an extra layer of privacy. According to Craig Federighi, the idea is that when a model receives your question, the data remains private and is neither stored nor used to train third-party models.
This approach, which sometimes slows down the development of features based on personal data, also explains some of the delays: maximizing privacy makes it much harder to give Siri access to exactly the personal information (messages, emails, or files) it needs to be truly "smart" in everyday life.
How to activate Siri today on iPhone, iPad, Mac, Apple Watch, and CarPlay
Beyond the new AI, how you summon Siri remains key to using it seamlessly. Apple offers different methods depending on the device and your accessibility settings.
On iPhone, after setting up the assistant, you can invoke it with your voice by saying "Siri" or "Hey Siri". You can also press and hold the side button on models with Face ID, or the Home button on older models. If you're using wired EarPods, press and hold the center button; with CarPlay, use the voice-command button on the steering wheel or the Home button on the CarPlay screen.
On the Apple Watch, especially from Series 3 onward, you have several options: raise your wrist near your mouth and speak, press and hold the Digital Crown, or use voice commands if you've enabled always-on listening. On Macs with an Apple chip you can also say "Siri" or "Hey Siri", use the dedicated icon in the menu bar, or configure a keyboard shortcut.
For those who prefer to write, there is Type to Siri, under Accessibility. It's enabled from Settings > Accessibility > Siri, where you can turn on Type to Siri on iPhone or iPad. Once enabled, simply press and hold the Siri button and type your request, which is especially useful when you don't want to speak aloud.
When you use a physical button to summon Siri on the iPhone, the voice response adapts to Silent mode: if it's enabled, the assistant responds silently; if not, it speaks aloud. If you invoke Siri with your voice, it usually responds aloud, although you can adjust this behavior in the assistant's settings.
CarPlay, third-party chatbots, and Siri's role in the car
Until now, the car has been practically Siri's exclusive territory. In CarPlay, the only voice-controlled assistant was Apple's, with minor exceptions such as voice search in mapping apps. That is about to change.
Apple is working to let AI applications with a voice mode, such as ChatGPT, Gemini, or Claude, have their own apps in CarPlay. The idea is that you can be driving, ask a work question or inquire about a town you're passing through, and talk to one of these chatbots without taking your hands off the wheel.
However, the company sets very clear limits: you won't be able to remap the Siri button on the steering wheel or use a wake word like "Hey ChatGPT" or "Ok Google". Siri will remain the only officially integrated assistant for controlling the system at the car level.
In practice, this means that to use a third-party chatbot, you will first need to open its application in CarPlay. Developers will be able to have voice mode activate automatically when the app opens (as sketched below), but there will always be that initial tap on the screen. Furthermore, these chatbots won't be able to control car or iPhone functions: they will focus on answering complex questions, not on turning on lights, changing the system music, or placing calls from the device outside their own app.
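As a rough illustration of what "voice mode starts when the app opens" could look like for a developer, here is a minimal sketch using CarPlay's template-based scene lifecycle. CPTemplateApplicationSceneDelegate, CPInterfaceController, and the list-template types are real CarPlay framework APIs; the chatbot app, its startVoiceSession() method, and the assumption that chatbot apps will use list templates are hypothetical, since Apple has not published the actual app category:

```swift
import UIKit
import CarPlay

// Minimal sketch of a CarPlay scene delegate for a voice-first chatbot app.
// The delegate methods and template types are real CarPlay APIs;
// startVoiceSession() stands in for the app's own audio/LLM pipeline.
final class CarPlaySceneDelegate: UIResponder, CPTemplateApplicationSceneDelegate {
    var interfaceController: CPInterfaceController?

    func templateApplicationScene(_ scene: CPTemplateApplicationScene,
                                  didConnect interfaceController: CPInterfaceController) {
        self.interfaceController = interfaceController

        // Show a simple root template while audio is active. The template
        // set Apple will allow for chatbot apps is not public yet.
        let item = CPListItem(text: "Start conversation", detailText: nil)
        let list = CPListTemplate(title: "Assistant",
                                  sections: [CPListSection(items: [item])])
        interfaceController.setRootTemplate(list, animated: true, completion: nil)

        // Per the reported behavior: voice mode can begin as soon as the
        // user opens the app, but never via a custom wake word.
        startVoiceSession()
    }

    func templateApplicationScene(_ scene: CPTemplateApplicationScene,
                                  didDisconnectInterfaceController interfaceController: CPInterfaceController) {
        self.interfaceController = nil
    }

    private func startVoiceSession() {
        // Hypothetical: start speech capture and stream it to the chatbot backend.
    }
}
```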
Meanwhile, Apple is preparing its own improvements: with iOS 26.4, Siri will begin to rely on a Google Gemini model running on Apple's infrastructure, and later, with iOS 27, the company plans a fully conversational assistant that will compete directly with ChatGPT, Gemini, and Claude.
Roadmap: from iOS 26.4 and 26.5 to a chatbot-like Siri in iOS 27
Apple has been announcing, and delaying, the new Siri for some time now. At WWDC 2024, Apple unveiled Apple Intelligence and a much more powerful Siri, capable of using personal data, understanding the screen, and controlling apps in advanced ways, with an initial promise of arriving in early 2025.
However, the company kept postponing the launch. First it moved to 2026, without specific dates, and later the release was internally tied to iOS 26.4, a spring update that has historically brought important changes (like 16.4, 17.4, and 18.4).
According to various leaks, including those from Bloomberg's Mark Gurman, the current plan is a phased rollout across iOS 26.4, iOS 26.5, and iOS 27: 26.4 as the structural base, 26.5 as an expansion of several key features still in testing, and 27 as the big commercial showcase with the fully chatbot-like Siri.
In the final weeks of internal testing, however, major problems have emerged: slowness, accuracy errors, an inability to handle complex queries well, and a particularly annoying bug that causes Siri to interrupt the user if they speak too quickly. Furthermore, the assistant sometimes delegates simple tasks to its ChatGPT integration when it should be able to handle them on its own.
These setbacks have forced Apple to consider delaying some of the features planned for iOS 26.4 until iOS 26.5 or even iOS 27. Employees testing the internal builds have acknowledged that, by the end of 2025, the system was so slow that several extra months of work seemed inevitable.
What iOS 26.4 will bring to Siri: personal context, screen, and actions in apps
iOS 26.4 is shaping up as a structural upgrade that lays the technical groundwork for the new Siri, more than a big marketing headline. Apple is preparing the first developer beta for the week of February 23, a public beta shortly after, and a final release between late March and early April.
The big news is that Apple Intelligence is really starting to reach Siri on three fronts: understanding personal context, awareness of what is on screen, and actions in and between applications through the App Intents framework.
First, personal context gives Siri something like short-term memory: it can resolve pronouns, vague references, and data that already lives on your device, such as Messages or WhatsApp conversations, Calendar events, Mail messages, or recent notes.
Second, screen awareness will make it possible to refer to "this", "that file", or "this message" without copying and pasting or explaining everything from scratch. The system will know what content is relevant in the app you're using and will be able to act on it, provided the application exposes that information through the appropriate APIs.
Lastly, App Intents will move from being hidden behind Shortcuts to becoming the language apps use to connect with Siri. Each developer defines what their application can do (create tasks, modify reservations, send internal messages, process payments, and so on), and the system can chain those actions together from a natural-language command, as the sketch below illustrates.
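To ground this, here is a minimal App Intents sketch. The framework, AppIntent protocol, @Parameter wrapper, and dialog result are real, shipping APIs; the task-manager app, its TaskStore, and exactly how the iOS 26.4 Siri will chain such intents are assumptions for illustration:

```swift
import AppIntents

// Hypothetical intent for a task-manager app. Siri and Shortcuts can
// discover it, fill its parameters from natural language, and run it.
struct CreateTaskIntent: AppIntent {
    static var title: LocalizedStringResource = "Create Task"
    static var description = IntentDescription("Creates a task in the app.")

    @Parameter(title: "Title")
    var taskTitle: String

    @Parameter(title: "Due Date")
    var dueDate: Date?

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // TaskStore stands in for the app's own persistence layer.
        TaskStore.shared.add(title: taskTitle, due: dueDate)
        return .result(dialog: "Added '\(taskTitle)' to your tasks.")
    }
}

// Stub store so the sketch compiles on its own.
final class TaskStore {
    static let shared = TaskStore()
    func add(title: String, due: Date?) { /* persist the task */ }
}
```

The design point is that each intent is a typed, discoverable unit of work: because the system knows its parameters and result, Siri can in principle compose several intents from different apps into one spoken request.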
Features that could be delayed until iOS 26.5
Although some of these capabilities were intended for iOS 26.4, recent tests suggest that some will arrive in limited form, or in "preview" mode, in iOS 26.5. Apple has instructed its engineers to test the new Siri features on internal builds of 26.5, which points to a partial delay.
One feature up in the air is Siri's expanded ability to access deep personal data. The goal: you can ask something like "find the podcast Marcos sent me by message a few months ago and play it" and have the assistant locate that link in your old conversations and launch it directly.
Internal versions of iOS 26.5 include a button that activates a preview of these advanced features, with the idea of warning the user that it's experimental and likely to fail or change, in the style of the system's public betas.
Also delayed are more powerful commands for controlling actions within apps, backed by App Intents. The typical example: saying "find a photo, retouch it, and send it to Laura", and having Siri locate the image, apply basic edits, and share it, all from a single command.
Employees testing iOS 26.5 report initial support for these features, but reliability still leaves much to be desired. That's why Apple is considering releasing them as a preview, keeping expectations in check while it gathers usage and bug data.
The big bet: Siri as a chatbot in iOS 27 and the “Field/Fields” project
Beyond 26.4 and 26.5, the big commercial leap will come with iOS 27, iPadOS 27, and macOS 27, which will debut a Siri transformed into a chatbot, codenamed "Field" or "Fields". The idea is to replace the current interface with a system-wide chat experience, accessible by both voice and text.
This new Siri chatbot will be able to hold long, ChatGPT-style conversations: a continuous thread where you can request summaries, text generation, image creation, advanced web searches, and the handling of complex tasks without leaving the conversation.
At a technical level, the architecture is built on Apple's Foundation Models platform, which now incorporates technology from the Google Gemini team. Part of the processing happens on the devices themselves (mainly iPhones and Macs with recent chips) and part in data centers with Apple-designed chips, within Project Baltra, Apple's cloud AI processing effort.
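Apple already ships a developer-facing piece of this stack: the Foundation Models framework introduced with iOS 26. A minimal sketch of on-device inference with it follows; the types are real framework APIs, but the prompt is illustrative, and whether the new Siri itself runs on exactly this API surface is an assumption:

```swift
import FoundationModels

// Minimal on-device inference sketch with Apple's Foundation Models
// framework (iOS 26+). SystemLanguageModel and LanguageModelSession
// are real types; the prompt and function are illustrative.
func summarizeDay(notes: String) async throws -> String {
    // The on-device model can be unavailable: unsupported hardware,
    // Apple Intelligence turned off, or the model still downloading.
    guard case .available = SystemLanguageModel.default.availability else {
        return "On-device model unavailable on this device."
    }

    let session = LanguageModelSession(
        instructions: "You are a concise assistant. Answer in two sentences."
    )
    let response = try await session.respond(to: "Summarize these notes: \(notes)")
    return response.content
}
```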
The company is also experimenting with a standalone app for managing previous interactions with the chatbot, similar to a history you can review, edit, or reuse for new tasks. From there you could, for example, retrieve a conversation in which you created a document or image and pick up the thread.
A key part of this next-generation Siri will be deep control of system functions and secure access to personal data: files, emails, events, contacts, notes, and more. Apple wants you to be able to ask it to reorganize your day, review documents, look up important information, and take action in apps like Mail, Calendar, or Safari with high-level commands.
Apple, Google and the Gemini agreement
Siri's transformation cannot be understood without the strategic alliance between Apple and Google to integrate Gemini into Apple Intelligence. This agreement, which became visible at the beginning of the year, is key to giving the assistant and other iOS features generative muscle.
Interestingly, public communication of the pact has been quite asymmetrical. The announcement appears primarily on Google's channels, while Apple's official platforms offer little detail. The deal exists, but the visibility is largely on Alphabet's side, raising questions among analysts and investors.
On a call with Alphabet investors, analyst Ken Gawrelski explicitly asked how this type of agreement fits with partners like Apple, in a context where platform usability and generative AI may outweigh traditional search clicks. Philipp Schindler's response focused on overall search performance and AI integration, but he avoided specifics about the deal with Apple.
On Apple's investor call, Tim Cook was somewhat more direct on the technology but just as evasive on the contract. He argued that Google's AI provides a very capable foundation for Apple's Foundation Models and that together they will unlock key experiences, but made it clear that the terms of the agreement will not be disclosed.
For investors, these earnings calls have become a crucial window into agreements that can affect product, strategy, and revenue. Although the companies avoid details, the pressure for clarity grows as AI becomes central to their value proposition.
Impact of this Siri reboot on the ecosystem and users
The relaunch of Siri is not just a matter of technological prestige: Apple is staking part of the perceived value of the iPhone and its ecosystem. iPhones already account for around 20% of global smartphone sales and 25% of the global installed base, roughly 1.5 billion active iPhones.
In markets like Spain, iOS holds around 28-29% of the mobile market, and Apple is the leading manufacturer by active device usage, ahead of Samsung. Every real improvement to Siri means millions of people can get more out of a device they've already paid for, without upgrading their hardware.
However, the most advanced features of Apple Intelligence and the new Siri will be restricted to recent models. As with the first wave, this update is limited to the iPhone 15 Pro, the iPhone 16, and Macs with an M1 chip or later, leaving out a significant share of the market: around 20-30%, according to some analysts.
That bottleneck cuts both ways: it can encourage device upgrades in a mature market, but it can also frustrate those who find that "their" Siri doesn't resemble the one Apple shows off in presentations. The feeling of using a scaled-down version of the assistant can hurt brand perception.
In parallel, the accumulated delay comes at the worst possible time. While the world gets used to interacting with ChatGPT, Gemini, or Claude, the iPhone's native assistant has largely kept working as it did in 2016, and every month of delay plays out across hundreds of millions of active devices.
Real-world examples of using the new Siri
To ground all this technical groundwork, it helps to imagine how everyday life changes with the new Siri under iOS 26.4 and, later, iOS 27. Here are three very specific scenes.
First, think of a complicated meeting: you have an email with an open contract and a chat with your lawyer. With iOS 26.4, you could say something like, "Siri, move this meeting with Ana to the first available slot this afternoon and resend her the contract." The system would use personal context (who Ana is), calendar information, and on-screen content to make the changes and send the correct document.
With iOS 27, the same situation becomes richer: Siri could propose alternatives, suggest adding the compliance officer, draft an email with the legal nuances, and ask whether you want to save a modified version of the contract, all in one continuous conversation with the chatbot.
Second example: a business trip with last-minute changes. Tickets in Wallet, reservations in PDFs within Files, and meetings in Calendar. In 26.4 you could ask: "Postpone all Thursday meetings to Friday, notify those affected, and change the return flight to the first one Saturday morning." Siri would have to coordinate the airline's App Intents, your calendar, and messaging apps to execute that set of actions.
In iOS 27, the assistant could go a step further: rethink the entire trip, suggest alternative hotels, recalculate the budget, and prepare a summary ready for Finance, all without leaving the conversation thread where you approve each change.
Third scenario: a day saturated with scattered information. You're reading a macro report in Safari, a market-research study lands in your email, and you're brainstorming ideas for a presentation in Notes. In 26.4, you'll be able to say, "Siri, create a note with the key ideas from this and remind me tomorrow at 8 to review it," and the system will use screen awareness and personal context to capture the relevant content.
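On the developer side, a request like that plausibly decomposes into an intent that saves the note and schedules the reminder. A minimal sketch, assuming a hypothetical notes app: AppIntents and EventKit are real frameworks, while CaptureNoteWithReminderIntent and NotesStore are stand-ins (a real app would also declare NSRemindersFullAccessUsageDescription in its Info.plist):

```swift
import AppIntents
import EventKit

// Hypothetical intent behind "create a note ... and remind me tomorrow at 8".
// The EventKit calls are real APIs; NotesStore stands in for app storage.
struct CaptureNoteWithReminderIntent: AppIntent {
    static var title: LocalizedStringResource = "Capture Note With Reminder"

    @Parameter(title: "Content")
    var content: String

    @Parameter(title: "Remind At")
    var remindAt: Date

    func perform() async throws -> some IntentResult & ProvidesDialog {
        NotesStore.shared.save(content)  // the app's own persistence

        // Ask for Reminders access (iOS 17+ full-access API).
        let store = EKEventStore()
        guard try await store.requestFullAccessToReminders() else {
            return .result(dialog: "I saved the note, but I need Reminders access.")
        }

        // Create the reminder with an alarm at the requested time.
        let reminder = EKReminder(eventStore: store)
        reminder.title = "Review note"
        reminder.calendar = store.defaultCalendarForNewReminders()
        reminder.addAlarm(EKAlarm(absoluteDate: remindAt))
        try store.save(reminder, commit: true)

        return .result(dialog: "Note saved and reminder set.")
    }
}

// Stub store so the sketch compiles on its own.
final class NotesStore {
    static let shared = NotesStore()
    func save(_ text: String) { /* persist the note */ }
}
```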
With the chatbot in iOS 27, you could ask it to summarize both documents, cross-reference the data, generate a presentation outline, and suggest several headlines to share directly with your team, all within a single conversational flow where you refine the result.
Taken as a whole, Siri's new phase combines an aggressive push into generative AI with a staunch defense of privacy, built on Apple's own models, the alliance with Google Gemini, and a deep redesign of how apps are exposed to the system. If Apple gets the execution right and brings developers and users along, the assistant that for years has been "for simple things" could finally become the layer that orchestrates almost everything you do with your iPhone, iPad, Mac, and even your car.