You’ve probably heard that Perplexity – a company whose web scraping tactics I generally despise, and the only AI bot we still block at MacStories – has rolled out an iOS version of their voice assistant that integrates with several native features of the operating system. Here’s their promo video in case you missed it:
This is a very clever idea: while other major LLMs’ voice modes are limited to having a conversation with the chatbot (with the kind of quality and conversation flow that, frankly, annihilates Siri), Perplexity put a different spin on it: they used native Apple APIs and frameworks to make conversations more actionable (some may even say “agentic”) and integrated with the Apple apps you use every day. I’ve seen a lot of people calling Perplexity’s voice assistant “what Siri should be” or arguing that Apple should consider Perplexity as an acquisition target because of this, and I thought I’d share some additional comments and notes after having played with their voice mode for a while.
The most important point about this feature is that, in hindsight, it's so obvious that I'm surprised OpenAI still hasn't shipped the same thing for their incredibly popular ChatGPT voice mode. Perplexity's iOS voice assistant isn't using any "secret" tricks or hidden APIs: they're simply integrating with existing frameworks and APIs that any third-party iOS developer can already work with. They're leveraging EventKit for reminder/calendar event retrieval and creation; they're using MapKit to load inline snippets of Apple Maps locations; they're using Mail's native compose sheet and Safari View Controller to let users send pre-filled emails or browse webpages manually; they're integrating with MusicKit to play songs from Apple Music, provided that you have the Music app installed and an active subscription. Theoretically, there is nothing stopping Perplexity from rolling additional frameworks such as ShazamKit, Image Playground, WeatherKit, the clipboard, or even photo library access into their voice assistant. Perplexity hasn't found a "loophole" to replicate Siri functionalities; they were just the first major AI company to actually do it.
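To put that in concrete terms, here's a minimal sketch (my own code, not Perplexity's, with a made-up function name and a placeholder search term) of what playing a song through MusicKit looks like for any third-party iOS app:

```swift
import MusicKit

// Hypothetical helper: search Apple Music for a song and start playback.
func playFirstMatch(for term: String) async throws {
    // Any app can request Apple Music access through public MusicKit APIs.
    guard await MusicAuthorization.request() == .authorized else { return }

    // Search the catalog for whatever song the assistant resolved the request to.
    var request = MusicCatalogSearchRequest(term: term, types: [Song.self])
    request.limit = 1
    let response = try await request.response()
    guard let song = response.songs.first else { return }

    // Queue it and play; this only works with the Music app installed and a subscription.
    let player = ApplicationMusicPlayer.shared
    player.queue = [song]
    try await player.play()
}
```

The EventKit, MapKit, and Safari View Controller integrations are the same story: ordinary public frameworks that any app can adopt.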
That, however, doesn't make their integration any less impressive, since it perfectly illustrates why people don't have the patience to deal with Siri's limitations anymore. In a post-LLM world, and especially now that LLMs have voice modes, we expect to be able to have a long-form conversation with an assistant that fully understands natural language, retains the full context of the conversation, and can mix and match tools such as web search, user context, and – in Perplexity's case – app integrations to complete our requests. From this perspective, Perplexity's voice mode absolutely puts Siri (even the version with ChatGPT integration) to shame.
To give you two examples, I asked the following questions to both Siri and Perplexity:
Can you play the song that was playing in The O.C. when Seth Cohen was wearing a Spider-Man mask and hanging upside down from the ceiling?
Can you tell me how many times iOS’ Control Center received a major redesign over the years?
Siri was downright useless, despite having ChatGPT integration enabled. For the Control Center query, it returned some unrelated results from Google search, one of them being a link to the Alexa app on the App Store (???). For the music query, it started playing a song called ‘Seth and Summer Forever’ by Babygirl. Welp.
Perplexity, on the other hand, one-shotted both queries. For the music query, I repeated the question but added "do that on YouTube since I don't have an Apple Music subscription" and, sure enough, it loaded a video of that scene in an embedded YouTube player. For the iOS history query, we did some back-and-forth to confirm all the major Control Center redesigns over the years, which Perplexity answered correctly, and when I asked at the end to create a reminder for me to check out all this information, it did exactly that.
You see, we’re not talking about some random generative AI slop here: we’re dealing with practical questions that, at this point in 2025, I would expect any modern AI assistant to answer reliably and quickly using a variety of tools at its disposal. You may disagree with the principles behind how these technologies were created in the first place, but I think it’s undeniable that Siri is producing absolute junk while Perplexity is being genuinely useful.
That said, there's only so much Perplexity's voice assistant can do given platform limitations, and their existing integrations also need some more work.
For starters, their voice assistant often fails to add due dates for reminders due today (weird), and it cannot save reminders into specific lists of the Reminders app. I know these features are possible to implement since plenty of third-party Reminders clients support them, so the issues are entirely in Perplexity's implementation. As I mentioned above, there are also several other iOS frameworks that Perplexity decided not to support in their first version of this product: technically, Perplexity could also integrate with HomeKit to control Home accessories (one of the few things I use Siri for these days), but their assistant doesn't support this functionality. Then there are all the integrations that are exclusive to Siri, which Perplexity can't implement because Apple doesn't offer related developer APIs. Only Siri can run shortcuts, set timers, call App Intents, send messages, create notes, open and change device settings, and more. If you ask me, these are all prime candidates for a certain governmental body to force Apple to open up in the name of fair competition. But, hey: I'm biased.
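For what it's worth, both of those missing details are exposed by EventKit and available to any app. Here's a rough sketch (again my own code, with a hypothetical function name and an example list name) of how a reminder gets a due date and lands in a specific list:

```swift
import EventKit

// Hypothetical helper: create a reminder with a due date in a named Reminders list.
func addReminder(titled title: String, toList listName: String, dueOn dueDate: Date) async throws {
    let store = EKEventStore()
    // iOS 17+ authorization call; earlier versions use requestAccess(to: .reminder).
    guard try await store.requestFullAccessToReminders() else { return }

    let reminder = EKReminder(eventStore: store)
    reminder.title = title

    // Saving into a specific list is just a matter of picking the matching EKCalendar...
    let lists = store.calendars(for: .reminder)
    reminder.calendar = lists.first { $0.title == listName } ?? store.defaultCalendarForNewReminders()

    // ...and the due date is a set of date components on the reminder itself.
    reminder.dueDateComponents = Calendar.current.dateComponents(
        [.year, .month, .day, .hour, .minute],
        from: dueDate
    )

    try store.save(reminder, commit: true)
}
```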
Looking at the big picture for a second, I do think Apple is in a precarious situation here. The fact that the company makes the best computers for AI is a double-edged sword: it's great for consumers, but those same consumers are increasingly using Apple devices as mere conduits for other companies' AIs, funneling their data, context, and – most importantly – habits into systems that Apple cannot control. If hundreds of millions of people are getting used to having actually productive conversations with ChatGPT's voice mode on a daily basis, what happens when OpenAI flips the switch on native iOS integrations in their iPhone app's voice assistant and Apple still hasn't entered this space for another year?
Following Perplexity's ingenious approach, OpenAI could do the funniest thing here. While Siri is getting some help from ChatGPT (often with poor results), OpenAI may decide to let the ChatGPT app integrate directly with the rest of iOS. I'm convinced it's going to happen, and I hope Siri's new leadership has a plan in place for this possibility.