How Extensions Actually Work: The App Store as AI Pipeline
The mechanism is more interesting than the headline. Apple isn't building a neutral, OS-level API that AI vendors plug into directly — instead, providers must ship Extensions support inside their existing App Store apps. A user installs the Gemini app, the Claude app, or ChatGPT, then opens Settings to designate one as the preferred Apple Intelligence provider. From that moment, Siri queries, Writing Tools rewrites, and Image Playground generations route through whichever app the user picked, with Apple acting as the dispatcher rather than the model host.
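The routing described above is, conceptually, a dispatcher with a user-selected backend. The sketch below is a purely hypothetical model of that flow; Apple has published no Extensions API, so every name here (`ExtensionsDispatcher`, `Provider`, `register`, `set_preferred`, `dispatch`) is an illustrative assumption, not a real interface.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional

# Hypothetical model of the reported flow: third-party App Store apps
# register as providers, the user designates one in Settings, and
# system features (Siri, Writing Tools, Image Playground) route
# through it. None of these names correspond to a real Apple API.

@dataclass
class Provider:
    name: str
    siri_voice: str                      # distinct voice per model, per the report
    handle: Callable[[str, str], str]    # (feature, query) -> response

class ExtensionsDispatcher:
    def __init__(self) -> None:
        self._providers: Dict[str, Provider] = {}
        self._preferred: Optional[str] = None

    def register(self, provider: Provider) -> None:
        # Installing a compatible App Store app would make it available here.
        self._providers[provider.name] = provider

    def set_preferred(self, name: str) -> None:
        # The Settings step: the user designates one installed provider.
        if name not in self._providers:
            raise KeyError(f"{name} is not installed")
        self._preferred = name

    def dispatch(self, feature: str, query: str) -> tuple[str, str]:
        # All system features route through the single preferred provider;
        # the returned voice tells the user which model is answering.
        if self._preferred is None:
            raise RuntimeError("no preferred provider set")
        provider = self._providers[self._preferred]
        return provider.siri_voice, provider.handle(feature, query)

dispatcher = ExtensionsDispatcher()
dispatcher.register(Provider("Gemini", "voice_a", lambda f, q: f"[Gemini:{f}] {q}"))
dispatcher.register(Provider("Claude", "voice_b", lambda f, q: f"[Claude:{f}] {q}"))
dispatcher.set_preferred("Claude")
voice, answer = dispatcher.dispatch("siri", "What's the weather?")
# voice identifies the provider; answer comes from the chosen model
```

The key property the sketch captures is that Apple owns only the registry and the routing decision, while the model inference lives entirely inside the provider's app.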
Apple is also planning a dedicated 'Extensions' section in the App Store as a marketplace for compatible AI apps. In a small but telling design choice, different Siri voices will be assignable to different models, so users can hear whether Apple's own Siri or a third-party LLM is answering. That voice-differentiation detail amounts to an admission of accountability risk: when Claude hallucinates inside 'Siri,' Apple wants the user to know it wasn't Siri. The unresolved question is approval: reporting has not yet established whether Apple will gatekeep which AI providers can offer Extensions, which leaves room for Apple either to run an open marketplace or to selectively bless partners.



