AI · 5 min read · The Verge AI

Apple will reportedly allow other AI chatbots to plug into Siri

By the Pixelift editorial team

Photo: a hand holding an iPhone with the new Siri and ChatGPT.

iPhone users will no longer be limited to a single AI solution: the upcoming iOS 27 update will reportedly introduce an "Extensions" feature that allows seamless integration of Siri with third-party chatbots. According to reports from Bloomberg's Mark Gurman, Apple plans to open its ecosystem to popular models such as Google's Gemini and Anthropic's Claude. The new system will let users download chatbots directly from the App Store and assign them to the voice assistant, a significant expansion of the current partnership with OpenAI and its ChatGPT.

For the global community of iPhone, iPad, and Mac users, this marks the end of Apple's "walled garden" era in artificial intelligence. Instead of relying on a single engine, users will be able to choose which model best handles specific queries or tasks within applications. Apple is also reportedly using Gemini technology to train its own smaller AI models, signaling a profound overhaul of Siri into an autonomous assistant capable of taking actions on the user's behalf.

The official presentation of these changes is expected at the WWDC conference, which begins on June 8. The decision will push AI developers into even fiercer competition for a spot in the pockets of millions, transforming Siri into a universal interface for the world's most powerful language models.

Apple's strategy in the field of artificial intelligence is undergoing a fundamental transformation. The Cupertino giant, which for years kept Siri within a tightly sealed proprietary ecosystem, has decided to open the door to the competition. According to the latest reports from Mark Gurman of Bloomberg, the upcoming iOS 27 update will introduce a revolutionary feature called Extensions. This solution will allow users to directly connect Siri with third-party AI chatbots, completely changing the way we interact with iPhones, iPads, and Mac computers.

This decision is a continuation of the path set by the earlier integration with OpenAI’s ChatGPT. This time, however, Apple is not limiting itself to a single partner. The "Extensions" system will allow a choice from a wide range of language models available in the App Store. This means that Siri, instead of relying solely on its own algorithms or a single external provider, will be able to delegate queries to powerhouses such as Google’s Gemini or Anthropic’s Claude.

Image: search and AI interface. Siri's new architecture is to be built on flexibility and user choice among AI models.

The end of monopoly and the era of AI extensions

The introduction of the Extensions system is a signal that Apple understands the limitations of its own models when faced with specialized chatbots. Users will gain full control over which AI tools have access to their voice assistant: system settings will let them enable or disable individual chatbots, much like the existing mechanism for managing app permissions. As a result, Siri will become an intelligent intermediary, a hub capable of pulling answers from the most competent source at any given moment.

Significantly, this integration will not be limited to simple question forwarding. The new "Extensions" are intended to work with Apple's planned standalone application for the refreshed version of Siri. This new assistant is expected to have the capability to take actions on behalf of the user within various applications. Key features of the upcoming system include:

  • Full integration with Google's Gemini and Anthropic's Claude directly through the Siri interface.
  • The ability to manage chatbots at the system level (enabling/disabling individual models).
  • Support for devices within the iPhone, iPad, and Mac ecosystem.
  • Utilization of external models to perform complex tasks within third-party applications.
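None of this API is public, but the routing model the reports describe — a registry of chatbots that can be toggled in system settings, with Siri delegating each query to the most capable enabled model and falling back to its own on-device engine — can be sketched in a few lines of illustrative Python. All names and capability tags here are hypothetical, not Apple's actual interface:

```python
# Hypothetical sketch of the reported "Extensions" hub model.
# Extension names and capability tags are illustrative only.

class ChatbotExtension:
    def __init__(self, name, capabilities):
        self.name = name
        self.capabilities = set(capabilities)  # e.g. {"code", "writing"}
        self.enabled = False                   # toggled in system settings

class SiriHub:
    def __init__(self):
        self.extensions = []

    def register(self, ext):
        self.extensions.append(ext)

    def route(self, needed_capability):
        # Delegate to the first enabled extension that claims the
        # capability; otherwise fall back to the built-in model.
        for ext in self.extensions:
            if ext.enabled and needed_capability in ext.capabilities:
                return ext.name
        return "Siri (on-device)"

hub = SiriHub()
hub.register(ChatbotExtension("Gemini", {"search", "long-context"}))
claude = ChatbotExtension("Claude", {"code", "writing"})
hub.register(claude)

claude.enabled = True
print(hub.route("code"))          # -> Claude
print(hub.route("long-context"))  # Gemini disabled -> Siri (on-device)
```

The point of the sketch is the permission-style toggle: disabling an extension removes it from routing entirely, which is how a settings switch can double as a privacy control.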

From an industry perspective, this move is extremely pragmatic. Instead of trying to overtake Google or OpenAI in the race for language model scale, Apple is creating a platform where these giants must compete with each other for the attention of the iOS user. It is a classic gatekeeper strategy: Apple maintains control over the user interface while offering users the best technologies available on the market.

Alliance with Google and model training

While opening up to multiple providers is a novelty, the foundation of the changes in Siri appears to be a deepening collaboration with Google. Reports from earlier this year indicated that Apple was testing Gemini technology to power key assistant functions. New reporting from The Information sheds further light on this: the agreement between the giants is said to cover not only the chatbot's answers but also Apple's right to use Gemini to train its own smaller AI models.

Image: AI technology illustration. Collaboration with Google and Anthropic is intended to fill gaps in Siri's native generative capabilities.

This strategic partnership allows Apple to develop on two tracks. On one hand, the company is building its own privacy-optimized models that run locally on devices (on-device AI). On the other, it provides users with access to powerful cloud-based models like Claude or Gemini when a task requires more computing power or specific knowledge. Such a hybrid model seems to be the only way to create an assistant that is both secure and truly intelligent.

Apple intends to present the latest versions of its operating systems during the Worldwide Developers Conference (WWDC), which begins on June 8. It is then that we will learn the final details regarding iOS 27 and how "Extensions" will change daily work with the brand's devices.

A new paradigm for the voice assistant

Transforming Siri into a bot more closely resembling ChatGPT is a process that has been underway for months. Apple had to close the gap with the competition, which revolutionized the concept of computer interaction in 2023 and 2024. Leaks about iOS 27 suggest that Siri will stop being just a tool for setting reminders and checking the weather. Thanks to integration with external models, the assistant will gain the ability for deep context analysis, code generation, text writing, or advanced data editing in office applications.

For developers from Anthropic or Google, presence in the "Extensions" system is a huge opportunity to reach billions of users without having to fight for the user to manually open their dedicated app. For Apple, in turn, it is a way to avoid allegations of monopolistic practices – by giving a choice between Claude, Gemini, and ChatGPT, the company positions itself as a neutral platform providing the best AI solutions.

Everything indicates that Apple no longer intends to fight for the title of creator of the "best AI in the world." Instead, the company wants to create the "best interface for AI." Opening Siri to third-party extensions in iOS 27 is an admission that in the world of generative artificial intelligence, the key to success is not isolation, but the orchestration of many different models. If the reports of the June premiere at WWDC are confirmed, we are in for the most significant change in iPhone history since the introduction of the App Store. Siri as we know it is becoming history, giving way to a flexible ecosystem of intelligent extensions.

Source: The Verge AI