
The iPad’s “Sweet” Solution

In working with my iPad Pro over the past few months, I’ve realized something that might have seemed absurd just a few years ago: some of the best apps I’m using – the ones with truly desktop-class layouts and experiences – aren’t native iPad apps.

They’re web apps.

Before I continue and share some examples, let me clarify that this is not a story about the superiority of one way of building software over another. I’ll leave that argument to developers and technically inclined folks who know much more about programming and software stacks than I do.

Rather, the point I’m trying to make is that, due to a combination of cost-saving measures by tech companies, Apple’s App Store policies over the years, and the steady rise of a generation of young coders who are increasingly turning to the web to share their projects, some of the best, most efficient workflows I can access on iPadOS are available via web apps in a browser or a PWA.

Read more


On Apple Offering an Abstraction Layer for AI on Its Platforms

Source: Apple.


I’ve been thinking about Apple’s position in AI a lot this week, and I keep coming back to this idea: if Apple is making the best consumer-grade computers for AI right now, but Apple Intelligence is failing third-party developers with a lack of AI-related APIs, should the company try something else to make it easier for developers to integrate AI into their apps?

Gus Mueller, creator of Acorn and Retrobatch, has been pondering similar thoughts:

A week or so ago I was grousing to some friends that Apple needs to open up things on the Mac so other LLMs can step in where Siri is failing. In theory we (developers) could do this today, but I would love to see a blessed system where Apple provided APIs to other LLM providers.

Are there security concerns? Yes, of course there are, there always will be. But I would like the choice.

The crux of the issue in my mind is this: Apple has a lot of good ideas, but they don’t have a monopoly on them. I would like some other folks to come in and try their ideas out. I would like things to advance at the pace of the industry, and not Apple’s. Maybe with a blessed system in place, Apple could watch and see how people use LLMs and other generative models (instead of giving us Genmoji that look like something Fisher-Price would make). And maybe open up the existing Apple-only models to developers. There are locally installed image processing models that I would love to take advantage of in my apps.

The idea is a fascinating one: if Apple Intelligence cannot compete with the likes of ChatGPT or Claude for the foreseeable future, but third-party developers are creating apps based on those APIs, is there a scenario in which Apple may regain control of the burgeoning AI app ecosystem by offering their own native bridge to those APIs?

Read more


The ‘e’ Is for Elemental

Source: Apple.


For the past 10 days, I’ve been testing the iPhone 16e – but not in the way I typically test new hardware. You see, I didn’t buy the iPhone 16e to make calls, send email, surf the web, post to social media, or anything else, really. Instead, I got it for one thing: the camera.

Read more


Notes on the Apple Intelligence Delay

Simon Willison, one of the more authoritative independent voices in the LLM space right now, published a good theory on what may have happened with Apple’s delay of Apple Intelligence’s Siri personalization features:

I have a hunch that this delay might relate to security.

These new Apple Intelligence features involve Siri responding to requests to access information in applications and then perform actions on the user’s behalf.

This is the worst possible combination for prompt injection attacks! Any time an LLM-based system has access to private data, tools it can call and potentially malicious instructions (like emails and text messages from untrusted strangers) there’s a risk that an attacker might subvert those tools and use them to damage or exfiltrate a user’s data.

Willison has been writing about prompt injection attacks since 2023. We know that Mail’s AI summaries were (at least initially?) somewhat susceptible to prompt injections (using hidden HTML elements), as were Writing Tools during the beta period. It’s scary to imagine what would happen with a well-crafted prompt injection once the attack surface becomes the entire assistant, directly plugged into your favorite apps with your data. But then again, one has to wonder why these features were demoed at all at Apple’s biggest software event last year and if those previews – absent a real, in-person event – were actually animated prototypes.

On this note, I disagree with Jason Snell’s idea that previewing Apple Intelligence last year was a good move no matter what. Are we sure that “nobody is looking” at Apple’s position in the AI space right now and that Siri isn’t continuing down its path of damaging Apple’s software reputation, like MobileMe did? As a reminder, the iPhone 16 lineup was advertised as “built for Apple Intelligence” in commercials, interviews, and Apple’s website.

If the company’s executives are so certain that the 2024 marketing blitz worked, why are they pulling Apple Intelligence ads from YouTube when “nobody is looking”?

On another security note: knowing Apple’s penchant for user permission prompts (Shortcuts and macOS are the worst offenders), I wouldn’t be surprised if the company tried to mitigate Siri’s potential hallucinations and/or the risk of prompt injections with permission dialogs everywhere, and later realized the experience was terrible. Remember: Apple announced an App Intents-driven system with assistant schemas that included actions for your web browser, file manager, camera, and more. Getting any of those actions wrong (think: worse than not picking your mom up at the airport, but actually deleting some of your documents) could have pretty disastrous consequences.

Regardless of what happened, here’s the kicker: according to Mark Gurman, “some within Apple’s AI division” believe that the delayed Apple Intelligence features may be scrapped altogether and replaced by a new system rebuilt from scratch. From his story, pay close attention to this paragraph:

There are also concerns internally that fixing Siri will require having more powerful AI models run on Apple’s devices. That could strain the hardware, meaning Apple either has to reduce its set of features or make the models run more slowly on current or older devices. It would also require upping the hardware capabilities of future products to make the features run at full strength.

Inference costs may have gone down over the past 12 months and context windows may have gotten bigger, but I’m guessing there’s only so much you can do locally with 8 GB of RAM when you have to draw on the user’s personal context across (potentially) dozens of different apps, and then have conversations with the user about those results. It’ll be interesting to watch what Apple does here within the next 1-2 years: more RAM for the same price on iPhones, even more tasks handed off to Private Cloud Compute, or a combination of both?

We’ll see how this will play out at WWDC 2025 and beyond. I continue to think that Apple and Google have the most exciting takes on AI in terms of applying the technology to users’ phones and the apps they use every day. The only difference is that one company’s announcements were theoretical, and the other’s are shipping today. It seems clear now that Apple got caught off guard by LLMs while they were going down the Vision Pro path, and I’ll be curious to see how their marketing strategy will play out in the coming months.


From a Turntable to an iPad Home Dashboard: My First Experience with Vinyl

This month, amidst the increasingly chaotic rumblings of the world in the news, I found myself looking for a new distraction. Since Civilization VII wasn’t out just yet, my eyes quickly turned to the vinyl record player that my partner and I had been storing in its cardboard box for months without ever taking the time to set it up. It’s not even ours; we’ve been keeping it safe, along with a sizable vinyl collection, for a close friend who unfortunately doesn’t have enough space for it in their current home.

This turntable is definitely not fancy – it’s even quite affordable compared to similar models – but it looks pretty, and our friend graciously gave us permission to set it up for ourselves in the living room. While I’m sure they only pitied my desperate need for a new distraction, I took them up on this offer and opened the turntable’s box for the first time.

At the risk of sounding like a total youngster, I must disclose that until three weeks ago, I had never interacted with vinyl before. All I had were some presumptuous preconceptions. “Doesn’t music sound worse on vinyl? Also, why should I bother with large, fragile music discs and a whole record player when I already have Apple Music in my pocket with lossless audio and Dolby Atmos?”

Still, I was not only intrigued, but also motivated to solve the main problem that setting up this record player posed: how can I make it work when our audio gear at home consists only of a handful of HomePod minis, one pair of wired headphones, and several pairs of Bluetooth headphones? While some turntables ship with built-in Bluetooth connectivity, ours can only output audio over USB or RCA with the help of a sound amplifier, and it definitely can’t broadcast audio to AirPlay devices like our HomePod minis.

Allow me to spoil the ending of this story for you: in the end, unboxing this turntable escalated into a legitimately awesome tech upgrade to our living room. It’s now equipped with a docked 11″ iPad Pro that acts as a shared dashboard for controlling our HomeKit devices, performing everyday tasks like consulting the weather and setting up timers, and of course, broadcasting our vinyls to any HomePod mini or Bluetooth device in the apartment. This setup is amazing, and it works perfectly; however, getting there was a tedious process that drastically reinforced my long-standing frustrations with Apple’s self-imposed software limitations.

Let me tell you how it all went.

Read more


DefaultSMS Lets You Choose Your Default Messaging App

In iOS 18.2, Apple introduced the ability for users to set their default apps for messaging, calling, call filtering, passwords, contactless payments, and keyboards. Previously, it was only possible to specify default apps for mail and browsing, so this was a big step forward.

While apps like 1Password quickly took advantage of these new changes, there have been few to no takers in the calling, contactless payments, and messaging categories. Enter DefaultSMS, a new app that, as far as I can tell, seems to be the first to make use of the default messaging app setting.

DefaultSMS is not a messaging app. What it does is use this new setting to effectively bounce the user into the messaging app of their choice when they tap on a phone number elsewhere within iOS. Telegram, WhatsApp, and Signal are the options currently supported in the app.

Initial setup is quick. First, you select the messaging app you would like to use within DefaultSMS. Then, you head to Settings → Apps → Default Apps → Messaging and select DefaultSMS instead of Messages.

When you tap on a phone number, DefaultSMS will launch your chosen messaging app to start a conversation.


Now, whenever you tap on a phone number from a website, email, note, or other source within iOS, the system will recognize the sms: link and open a new message to that number in your default messaging app, now specified as DefaultSMS. The app will then bounce you into your messaging app of choice to start the conversation. The developer says the process is 100% private, with DefaultSMS retaining none of this information.
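To make the mechanism concrete, here’s a minimal, hypothetical sketch of the kind of URL rewriting an app like DefaultSMS presumably performs: take the phone number out of an incoming sms: URL and rebuild it as a deep link for the chosen messenger. The exact deep-link formats below (wa.me, tg://resolve, signal.me) are assumptions for illustration, not confirmed details of how DefaultSMS works internally.

```python
from urllib.parse import urlparse

# Assumed deep-link formats for each supported messenger (illustrative only).
DEEP_LINKS = {
    "whatsapp": "https://wa.me/{number}",
    "telegram": "tg://resolve?phone={number}",
    "signal": "https://signal.me/#p/+{number}",
}

def rewrite_sms_url(sms_url: str, target_app: str) -> str:
    """Extract the phone number from an sms: URL and build a deep link."""
    parsed = urlparse(sms_url)
    # In "sms:+15551234567", the number lands in the path component.
    number = parsed.path.lstrip("+")
    return DEEP_LINKS[target_app].format(number=number)

print(rewrite_sms_url("sms:+15551234567", "whatsapp"))
# https://wa.me/15551234567
```

On iOS the real app would register as the default messaging handler and call the resulting URL via the system, but the translation step itself is this simple: one scheme in, another scheme out.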

It’s worth pointing out a few things about the app:

  • You can only message someone who already has the app you are messaging from.
  • If someone sends you an SMS, it will still be delivered to the Messages app.
  • Once you start a conversation, you will be messaging from the app you have chosen (such as WhatsApp), not via SMS.

So why does this app exist? I put this question to the developer, Vincent Neo, who said, “The focus of the app is more towards countries where a significant part of the population already prefers a specific platform very frequently, such that users are very likely to prefer that over other platforms (including SMS), similar to your case, where everyone you know has WhatsApp.”

Quite simply, DefaultSMS allows you to choose which app you want to use to start a conversation when you tap a phone number, rather than always reverting to Messages. The app also highlights a flaw in the phrase “default messaging app”: there are still no APIs for apps to receive SMS messages. Until those are added, we will have to rely on clever third-party utilities like DefaultSMS to get us halfway there.

DefaultSMS is available on the App Store for $0.99.


Gemini 2.0 and LLMs Integrated with Apps

Busy day at Google today: the company rolled out version 2.0 of its Gemini AI assistant (previously announced in December) with a variety of new and updated models to more users. From the Google blog:

Today, we’re making the updated Gemini 2.0 Flash generally available via the Gemini API in Google AI Studio and Vertex AI. Developers can now build production applications with 2.0 Flash.

We’re also releasing an experimental version of Gemini 2.0 Pro, our best model yet for coding performance and complex prompts. It is available in Google AI Studio and Vertex AI, and in the Gemini app for Gemini Advanced users.

We’re releasing a new model, Gemini 2.0 Flash-Lite, our most cost-efficient model yet, in public preview in Google AI Studio and Vertex AI.

Finally, 2.0 Flash Thinking Experimental will be available to Gemini app users in the model dropdown on desktop and mobile.

Read more


The Many Purposes of Timeline Apps for the Open Web

Tapestry (left) and Reeder.


Writing at The Verge following the release of The Iconfactory’s new app Tapestry, David Pierce perfectly encapsulates how I feel about the idea of “timeline apps” (a name that I’m totally going to steal, thanks David):

What I like even more, though, is the idea behind Tapestry. There’s actually a whole genre of apps like this one, which I’ve taken to calling “timeline apps.” So far, in addition to Tapestry, there’s Reeder, Unread, Feeeed, Surf, and a few others. They all have slightly different interface and feature ideas, but they all have the same basic premise: that pretty much everything on the internet is just feeds. And that you might want a better place to read them.
[…]
These apps can also take some getting used to. If you’re coming from an RSS reader, where everything has the same format — headline, image, intro, link — a timeline app will look hopelessly chaotic. If you’re coming from social, where everything moves impossibly fast and there’s more to see every time you pull to refresh, the timeline you curate is guaranteed to feel boring by comparison.

I have a somewhat peculiar stance on this new breed of timeline apps, and since I’ve never written about them on MacStories before, allow me to clarify and share some recent developments in my workflow while I’m at it.

Read more


Six Colors’ Apple in 2024 Report Card

Average scores from the 2024 Six Colors report card. Source: Six Colors.

For the past 10 years, Six Colors’ Jason Snell has put together an “Apple report card” – a survey to assess the current state of Apple “as seen through the eyes of writers, editors, developers, podcasters, and other people who spend an awful lot of time thinking about Apple”.

The 2024 edition of the Six Colors Apple Report Card has been published, and you can find an excellent summary of all the submitted comments along with charts featuring average scores for the different categories here.

I’m grateful that Jason invited me to take part again and share my thoughts on Apple’s 2024. As you’ll see from my comments below, last year represented the end of an interesting transition period for me: after years of experiments, I settled on the iPad Pro as my main computer. Despite my personal enthusiasm, however, the overall iPad story remained frustrating with its peculiar mix of phenomenal M4 hardware and stagnant software. The iPhone lineup impressed me with its hardware (across all models), though I’m still wishing for that elusive foldable form factor. I was very surprised by the AirPods 4, and while Vision Pro initially showed incredible promise, I found myself not using it that much by the end of the year.

I’ve prepared the full text of my responses for the Six Colors report card, which you can find below.

Read more