I’ve been thinking about Apple’s position in AI a lot this week, and I keep coming back to this question: if Apple is making the best consumer-grade computers for AI right now, but Apple Intelligence is failing third-party developers with its lack of AI-related APIs, should the company try something else to make it easier for developers to integrate AI into their apps?
Gus Mueller, creator of Acorn and Retrobatch, has been pondering similar thoughts:
A week or so ago I was grousing to some friends that Apple needs to open up things on the Mac so other LLMs can step in where Siri is failing. In theory we (developers) could do this today, but I would love to see a blessed system where Apple provided APIs to other LLM providers.
Are there security concerns? Yes, of course there are, there always will be. But I would like the choice.
The crux of the issue in my mind is this: Apple has a lot of good ideas, but they don’t have a monopoly on them. I would like some other folks to come in and try their ideas out. I would like things to advance at the pace of the industry, and not Apple’s. Maybe with a blessed system in place, Apple could watch and see how people use LLMs and other generative models (instead of giving us Genmoji that look like something Fisher-Price would make). And maybe open up the existing Apple-only models to developers. There are locally installed image processing models that I would love to take advantage of in my apps.
The idea is a fascinating one: if Apple Intelligence cannot compete with the likes of ChatGPT or Claude for the foreseeable future, but third-party developers are creating apps based on those APIs, is there a scenario in which Apple may regain control of the burgeoning AI app ecosystem by offering their own native bridge to those APIs?
Essentially, I’m thinking of a model similar to what Cursor, Perplexity, and dozens of other AI companies do: instead of necessarily bringing your own API key, you can use an abstraction layer in the middle that absorbs all the costs of the API you’re calling – usually, for a monthly fee and within certain limits. What if Apple followed a similar approach in iOS 19/macOS 16 with an Apple Intelligence API that is actually a bridge between native apps and other cloud-based AI providers?
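To make that more concrete, here’s a purely hypothetical sketch of what such a bridge could look like from a developer’s perspective. None of these types exist in Apple’s SDKs today; the `IntelligenceSession` and `AIProvider` names are invented for illustration only.

```swift
import Foundation

// Hypothetical Apple Intelligence bridge – invented API, purely illustrative.
// The app picks a sanctioned provider, Apple proxies the request, and the
// cost is absorbed by the developer's plan rather than a per-app API key.
enum AIProvider {
    case chatGPT
    case claude
}

struct IntelligenceSession {
    let provider: AIProvider

    // Imagined async call: the prompt is routed through Apple's bridge
    // to the chosen provider, within the plan's monthly limits.
    func respond(to prompt: String) async throws -> String {
        // Placeholder; a real bridge would perform the network call here.
        return ""
    }
}

// How an app might adopt it:
func summarizeArticle(_ text: String) async throws -> String {
    let session = IntelligenceSession(provider: .claude)
    return try await session.respond(to: "Summarize this article:\n\(text)")
}
```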
It may sound silly at first, but stay with me for a second. The first question that would need to be answered is: why should a developer even consider this instead of rolling their own API key for a third-party provider? Cost is the first factor that comes to mind.
I would imagine that, unlike indie developers or smaller companies, Apple would have the power to negotiate much lower API fees with OpenAI or Anthropic, offer some kind of “entry-level” access for free as part of the Apple Developer Program, and give developers a more convenient pricing structure for apps that want to perform a lot of AI calls on a monthly basis.
There are precedents here. For starters, we’re talking about a company that negotiated free ChatGPT use in Siri with OpenAI in exchange for, well, exposure. I’m pretty sure neither OpenAI nor Anthropic would mind becoming officially sanctioned AI providers in a theoretical Apple Intelligence SDK. Second, Apple already offers a web service with an API based on both free access and flexible pricing tiers: WeatherKit. If you’re the developer of a weather app, you can perform up to 500,000 monthly calls for free as part of your Apple Developer Program membership; above that, there are multiple pricing tiers. In the case of AI, those would be subscriptions paid by developers directly to Apple rather than to OpenAI or Anthropic. Plus, for WeatherKit, Apple provides developers with extensive documentation and libraries for integrating the technology into their native apps. The service is advertised as “easy to use” and “privacy first”.
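For reference, this is roughly what WeatherKit integration looks like in Swift today – a single async call against Apple’s service, with the free tier and billing handled through the developer account. It’s the level of convenience an Apple-run AI bridge could plausibly aim for.

```swift
import WeatherKit
import CoreLocation

// WeatherKit today: one shared service, one async call, and the first
// 500,000 monthly requests included with the Apple Developer Program.
func currentTemperature(at location: CLLocation) async throws -> Measurement<UnitTemperature> {
    let weather = try await WeatherService.shared.weather(for: location)
    return weather.currentWeather.temperature
}
```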
Which brings me to my second point. The other feature I could see Apple marketing as part of a “ChatGPT/Claude via Apple Intelligence” developer package is privacy and data retention policies. I hear from so many developers these days who, beyond pricing alone, are hesitant to integrate third-party AI providers into their apps because they don’t trust those providers’ data and privacy policies, or aren’t at ease with the U.S.-based servers that power the most popular AI companies. It’s a legitimate concern, and it results in a lot of potentially good app ideas being left on the table.
Once again, just like we’ve seen with the existing ChatGPT integration in Siri, Apple may be in a position to elegantly solve this by telling developers that they can have their AI cake and eat it, too. An Apple Intelligence SDK for third-party AI providers could come with a guarantee that, to protect users’ privacy, data sent from apps won’t be used for training and will be discarded soon after a request is completed. It could even offer fine-grained geographic controls to automatically route users’ requests to servers in appropriate locations that aren’t in the United States by default. Such an abstraction layer would carry Apple’s privacy seal of approval, likely making developers and users more comfortable with the idea of AI features being built into their favorite apps.
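Sketching this out one more time – again, entirely hypothetical, with invented names – the privacy guarantees and geographic routing could surface as explicit, developer-visible request options rather than fine print:

```swift
// Hypothetical request options for an Apple Intelligence bridge.
// None of these types exist; they only illustrate the kind of controls
// Apple could expose on top of third-party providers.
struct IntelligenceRequestOptions {
    // Data sent with the request is never used for model training.
    var excludeFromTraining: Bool = true

    // How long the provider may keep the request before discarding it.
    var retention: Retention = .discardAfterResponse

    // Preferred server region, e.g. keeping EU users' data on EU servers.
    var preferredRegion: Region = .automatic

    enum Retention {
        case discardAfterResponse
        case hours(Int)
    }

    enum Region {
        case automatic
        case europeanUnion
        case unitedStates
    }
}
```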
The more I think about this potential scenario, the more I’m intrigued by it. From a financial standpoint, I bet that developers who are currently selling subscriptions to justify the recurring costs of third-party AI APIs would be happy to implement a cheaper abstraction layer instead. And from a user’s perspective, I certainly wouldn’t mind getting powerful AI features that Apple alone can’t provide, without any of the creepiness traditionally involved.
Of course, the alternative is for Apple to bide their time, wait until they have a proper Apple Intelligence LLM to offer as an API for all kinds of features (that is, beyond summaries alone), and let third-party developers continue building primarily through other providers’ SDKs and APIs. But if the enemy of my enemy is my friend, I wouldn’t be surprised to see Apple offer something along these lines in the near future.