Federico Viticci

9568 posts on MacStories since April 2009

Federico is the founder and Editor-in-Chief of MacStories, where he writes about Apple with a focus on apps, developers, iPad, and iOS productivity. He founded MacStories in April 2009 and has been writing about Apple ever since. Federico is also the co-host of AppStories, a weekly podcast exploring the world of apps; Unwind, a fun exploration of media and more; and NPC: Next Portable Console, a show about portable gaming and the handheld revolution.


iPad Pro for Everything: How I Rethought My Entire Workflow Around the New 11” iPad Pro

My 11” iPad Pro.

For the past two years since my girlfriend and I moved into our new apartment, my desk has been in a constant state of flux. Those who have been reading MacStories for a while know why. There were two reasons: I couldn’t figure out how to use my iPad Pro for everything I do, specifically for recording podcasts the way I like, and I couldn’t find an external monitor that would let me both work with the iPad Pro and play videogames when I wasn’t working.

This article – which has been six months in the making – is the story of how I finally did it.

Over the past six months, I completely rethought my setup around the 11” iPad Pro and a monitor that gives me the best of both worlds: a USB-C connection for when I want to work with iPadOS at my desk and multiple HDMI inputs for when I want to play my PS5 Pro or Nintendo Switch. Getting to this point has been a journey, which I have documented in detail on the MacStories Setups page.

This article started as an in-depth examination of my desk, the accessories I use, and the hardware I recommend. As I was writing it, however, I realized that it had turned into something bigger. It’s become the story of how, after more than a decade of working on the iPad, I was able to figure out how to accomplish the last remaining task in my workflow, but also how I fell in love with the 11” iPad Pro all over again thanks to its nano-texture display.

I started using the iPad as my main computer 12 years ago. Today, I am finally able to say that I can use it for everything I do on a daily basis.

Here’s how.



The Strange Case of Apple Intelligence’s iPhone-only Mail Sorting Feature

Tim Hardwick, writing for MacRumors, on a strange limitation of the Apple Intelligence rollout earlier this week:

Apple’s new Mail sorting features in iOS 18.2 are notably absent from both iPadOS 18.2 and macOS Sequoia 15.2, raising questions about the company’s rollout strategy for the email management system.

The new feature automatically sorts emails into four distinct categories: Primary, Transactions, Updates, and Promotions, with the aim of helping iPhone users better organize their inboxes. Devices that support Apple Intelligence also surface priority messages as part of the new system.

Users on iPhone who updated to iOS 18.2 have the features. However, iPad and Mac users who updated their devices with the software that Apple released concurrently with iOS 18.2 will have noticed their absence. iPhone users can easily switch between categorized and traditional list views, but iPad and Mac users are limited to the standard chronological inbox layout.

This was so odd during the beta cycle, and it continues to be the single decision I find the most perplexing in Apple’s launch strategy for Apple Intelligence.

I didn’t cover Mail’s new smart categorization feature in my story about Apple Intelligence for one simple reason: it’s not available on the device where I do most of my work, my iPad Pro. I’ve been able to test the functionality on my iPhone, and it’s good enough: iOS occasionally gets a category wrong, but (surprisingly) you can manually categorize a sender and train the system yourself.

(As an aside: can we talk about the fact that a bunch of options, including sender categorization, can only be accessed via Mail’s…Reply button? How did we end up in this situation?)

I would very much prefer to use Apple Mail instead of Spark, which offers smart inbox categorization across platforms but is nowhere near as nice-looking as Mail and comes with its own set of quirks. However, as long as smart categories are exclusive to the iPhone version of Mail, Apple’s decision prevents me from incorporating the updated Mail app into my daily workflow.


Apple Intelligence in iOS 18.2: A Deep Dive into Working with Siri and ChatGPT, Together

The ChatGPT integration in iOS 18.2.

Apple is releasing iOS and iPadOS 18.2 today, and with those software updates, the company is rolling out the second wave of Apple Intelligence features as part of their previously announced roadmap that will culminate with the arrival of deeper integration between Siri and third-party apps next year.

In today’s release, users will find native integration between Siri and ChatGPT, more options in Writing Tools, a smarter Mail app with automatic message categorization, generative image creation in Image Playground, Genmoji, Visual Intelligence, and more. It’s certainly a more ambitious rollout than the somewhat disjointed debut of Apple Intelligence with iOS 18.1, and one that will garner more attention if only by virtue of Siri’s native access to OpenAI’s ChatGPT.

And yet, despite the long list of AI features in these software updates, I find myself mostly underwhelmed – if not downright annoyed – by the majority of the Apple Intelligence changes, but not for the reasons you may expect coming from me.

Some context is necessary here. As I explained in a recent episode of AppStories, I’ve embarked on a bit of a journey lately in terms of understanding the role of AI products and features in modern software. I’ve been doing a lot of research, testing, and reading about the different flavors of AI tools that we see pop up on almost a daily basis now in a rapidly changing landscape. As I discussed on the show, I’ve landed on two takeaways, at least for now:

  • I’m completely uninterested in generative products that aim to produce images, video, or text to replace human creativity and input. I find products that create fake “art” sloppy, distasteful, and objectively harmful for humankind because they aim to replace the creative process with a thoughtless approximation of what it means to be creative and express one’s feelings, culture, and craft through genuine, meaningful creative work.
  • I’m deeply interested in the idea of assistive and agentic AI as a means to remove busywork from people’s lives and, well, assist people in the creative process. In my opinion, this is where the more intriguing parts of the modern AI industry lie:
    • agents that can perform boring tasks for humans with a higher degree of precision and faster output;
    • coding assistants to put software in the hands of more people and allow programmers to tackle higher-level tasks;
    • RAG-infused assistive tools that can help academics and researchers; and
    • protocols that can map an LLM to external data sources such as Claude’s Model Context Protocol.

I see these tools as a natural evolution of automation and, as you can guess, that has inevitably caught my interest. The implications for the Accessibility community in this field are also something we should keep in mind.

To put it more simply, I think empowering LLMs to be “creative” with the goal of displacing artists is a mistake, and also a distraction – a glossy facade largely amounting to a party trick that gets boring fast and misses the bigger picture of how these AI tools may practically help us in the workplace, healthcare, biology, and other industries.

This is how I approached my tests with Apple Intelligence in iOS and iPadOS 18.2. For the past month, I’ve extensively used Claude to assist me with the making of advanced shortcuts, used ChatGPT’s search feature as a Google replacement, indexed the archive of my iOS reviews with NotebookLM, relied on Zapier’s Copilot to more quickly spin up web automations, and used both Sonnet 3.5 and GPT-4o to rethink my Obsidian templating system and note-taking workflow. I’ve used AI tools for real, meaningful work that revolved around me – the creative person – doing the actual work and letting software assist me. And at the same time, I tried to add Apple’s new AI features to the mix.

Perhaps it’s not “fair” to compare Apple’s newfangled efforts to products by companies that have been iterating on their LLMs and related services for the past five years, but when the biggest tech company in the world makes bold claims about their entrance into the AI space, we have to take them at face value.

It’s been an interesting exercise to see how far behind Apple is compared to OpenAI and Anthropic in terms of the sheer capabilities of their respective assistants; at the same time, I believe Apple has some serious advantages in the long term as the platform owner, with untapped potential for integrating AI more deeply within the OS and apps in a way that other AI companies won’t be able to. There are parts of Apple Intelligence in 18.2 that hint at much bigger things to come in the future that I find exciting, as well as features available today that I’ve found useful and, occasionally, even surprising.

With this context in mind, in this story you won’t see any coverage of Image Playground and Image Wand, which I believe are ridiculously primitive and perfect examples of why Apple may think they’re two years behind their competitors. Image Playground in particular produces “illustrations” that you’d be kind to call abominations; they remind me of the worst Midjourney creations from 2022. Instead, I will focus on the more assistive aspects of AI and share my experience with trying to get work done using Apple Intelligence on my iPhone and iPad alongside its integration with ChatGPT, which is the marquee addition of this release.

Let’s dive in.



Apple Frames 3.3 Adds Support for iPhone 16 and 16 Pro, M4 iPad Pro, and Apple Watch Series 10 (feat. An Unexpected Technical Detour)

Apple Frames 3.3 supports all the new devices released by Apple in 2024.

Well, this certainly took longer than expected.

Today, I’m happy to finally release version 3.3 of Apple Frames, my shortcut to put screenshots inside physical frames of Apple devices. In this new version, which is a free update for everyone, you’ll find support for all the new devices Apple released in 2024:

  • 11” and 13” M4 iPad Pro
  • iPhone 16 and iPhone 16 Pro lineup
  • 42mm and 46mm Apple Watch Series 10

To get started with Apple Frames, simply head to the end of this post (or search for Apple Frames in the MacStories Shortcuts Archive), download the updated shortcut, and replace any older version you may have installed with it. The first time you run the shortcut, you’ll be asked to redownload the file assets necessary for Apple Frames, which is a one-time operation. Once that’s done, you can resume framing your screenshots like you’ve always done, either using the native Apple Frames menu or the advanced API that I introduced last year.

So what took this update so long? Well, if you want to know the backstory, keep on reading.



A Feature from 10 Years Ago Is Back – with a Twist – in My Favorite RSS Client

Unread’s new custom shortcuts.

When it comes to productivity apps, especially those that have to work within the constraints of iOS and iPadOS, it’s rare these days to stumble upon a new idea that has never been tried before. With the exception of objectively new technologies such as LLMs, or unless there’s a new framework that Apple is opening up to developers, it can often feel like most ideas have been attempted before and we’re simply retreading old ground.

Let me be clear: I don’t think there’s anything inherently wrong with that. I’ve been writing about iPhone and iPad apps for over a decade now, and I believe there are dozens of design patterns and features that have undeservedly fallen out of fashion. But such is life.

Today marks the return of a very MacStories-y feature in one of my longtime favorite apps, which – thanks to this new functionality – is gaining a permanent spot on my Home Screen. Namely, the RSS client Unread now lets you create custom article actions powered by the Shortcuts app.



Denim Adds Direct Spotify Integration to Customize Playlist Artwork

Denim’s Spotify integration.

I don’t remember exactly when I started using Denim, but it was years ago, and I was looking for a way to spruce up the covers of my playlists. I was using Apple Music at the time, and it was before Apple added basic playlist cover generation features to the Music app. Even after that feature came to Music, Denim still provided more options in terms of colors, fonts, and patterns. Earlier this year, I covered its 3.0 update for Club members, which added the ability to automatically recognize artists featured in playlists.

I switched to Spotify months ago (and haven’t looked back since; its music discovery is still leagues ahead of Apple Music’s), and I was very happy to see recently that Denim can now integrate with Spotify directly, without the need to save covers to the Photos app first. Essentially, once you’ve logged in with your Spotify account, the app is connected to your library and has access to your playlists. You can pick an existing playlist directly from Denim, customize its cover, and save it back to your Spotify account without opening the Spotify app or having to save an image file upfront.

That’s possible thanks to Spotify’s web-based API for third-party apps, which allows a utility like Denim to simplify the creation flow of custom covers down to a couple of taps. In a nice touch, once a playlist cover has been saved to Spotify, the app lets you know with haptic feedback and allows you to immediately view the updated cover in Spotify, should you want to double-check the results in the context of the app.
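For context, Spotify’s Web API includes an endpoint for replacing a playlist’s cover image, which is presumably what makes this one-tap saving possible. Below is a minimal sketch in Swift of what such a request could look like; the function name and token handling are illustrative rather than Denim’s actual implementation, and you’d need an access token authorized for image uploads on the user’s account.

    import Foundation

    // A minimal sketch (not Denim's actual code) of the Spotify Web API request
    // for replacing a playlist's cover image. It assumes you already have an
    // OAuth access token authorized for image uploads, plus a JPEG small enough
    // for Spotify's cover upload limit.
    func uploadPlaylistCover(playlistID: String,
                             jpegData: Data,
                             accessToken: String) async throws {
        let url = URL(string: "https://api.spotify.com/v1/playlists/\(playlistID)/images")!
        var request = URLRequest(url: url)
        request.httpMethod = "PUT"
        request.setValue("Bearer \(accessToken)", forHTTPHeaderField: "Authorization")
        request.setValue("image/jpeg", forHTTPHeaderField: "Content-Type")
        // The request body is the Base64-encoded JPEG data, not the raw bytes.
        request.httpBody = jpegData.base64EncodedData()

        let (_, response) = try await URLSession.shared.data(for: request)
        guard let http = response as? HTTPURLResponse,
              (200...299).contains(http.statusCode) else {
            throw URLError(.badServerResponse)
        }
    }

Everything else (picking the playlist, generating the artwork, authorizing the account) happens on the device, which is why the whole flow can be reduced to a couple of taps.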

The combination of this fast customization process for Spotify and new artwork options added in this release only cements Denim’s role as the best utility for people who care about the looks of the playlists they share with friends and family. Denim is available on the App Store for free, with both a lifetime purchase ($19.99) and annual subscription ($4.99) available to unlock its full feature set.


iPod Fans Are Trying to Preserve Lost Click Wheel Games

I last wrote about iPod click wheel games here on MacStories in…2011, when Apple officially delisted them from the iTunes Store. Thirteen years later, some enterprising iPod fans are trying to preserve those games and find a way to let other old-school iPod fans play them today.

Here’s Kyle Orland, writing at Ars Technica:

In recent years, a Reddit user going by the handle Quix used this workaround to amass a local library of 19 clickwheel iPod games and publicly offered to share “copies of these games onto as many iPods as I can.” But Quix’s effort ran into a significant bottleneck of physical access—syncing his game library to a new iPod meant going through the costly and time-consuming process of shipping the device so it could be plugged into Quix’s actual computer and then sending it back to its original owner.

Enter Reddit user Olsro, who earlier this month started the appropriately named iPod Clickwheel Games Preservation Project. Rather than creating his master library of authorized iTunes games on a local computer in his native France, Olsro sought to “build a communitarian virtual machine that anyone can use to sync auth[orized] clickwheel games into their iPod.” While the process doesn’t require shipping, it does necessitate jumping through a few hoops to get the Qemu Virtual Machine running on your local computer.

Olsro’s project is available here, and it includes instructions on how to set up the virtual machine so you can install the games yourself. Did you know that, for example, Square Enix made two iPod games, Crystal Defenders and Song Summoner? Without these fan-made projects, all of these games would be lost to time and link rot – and we unfortunately know why.


You Can Use Clean Up with a Clear Conscience

I enjoyed this take on Apple Intelligence’s Clean Up feature by Joe Rosensteel, writing for Six Colors last week:

The photographs you take are not courtroom evidence. They’re not historical documents. Well, they could be, but mostly they’re images to remember a moment or share that moment with other people. If someone rear-ended your car and you’re taking photos for the insurance company, then that is not the time to use Clean Up to get rid of people in the background, of course. Use common sense.

Clean Up is a fairly conservative photo editing tool in comparison to what other companies offer. Sometimes, people like to apply a uniform narrative that Silicon Valley companies are all destroying reality equally in the quest for AI dominance, but that just doesn’t suit this tool that lets you remove some distractions from your image.

It’s easy to get swept up in the “But what is a photo” philosophical debate (which I think raises a lot of interesting points), but I agree with Joe: we should also keep in mind that, sometimes, we’re just removing that random tourist from the background and our edit isn’t going to change the course of humankind’s history.

Also worth remembering:

For some reason, even the most literal of literal people is fine with composing a shot to not include things. To even (gasp!) crop things out of photos. You can absolutely change meaning and context just as much through framing and cropping as you can with a tool like Clean Up. No one is suggesting that the crop tool be removed or that we should only be allowed to take the widest wide-angle photographs possible to include all context at all times, like security camera footage.


iPad mini Review: The Third Place

The new iPad mini.

My first reaction when I picked up the new iPad mini last Thursday morning was that it felt heavier than my 11” iPad Pro. Obviously, that was not the case – it’s nearly 150 grams lighter, in fact. But after several months of intense usage of the new, incredibly thin iPad Pro, the different weight distribution and the thicker form factor of the iPad mini got me for a second. Despite being “new”, compared to the latest-generation iPad Pro, the iPad mini felt old.

The second thing I noticed is that, color aside, the new iPad mini looks and feels exactly like the sixth-generation model I reviewed here on MacStories three years ago. The size is the same, down to the millimeter. The weight is the same. The display technology is the same. Three minor visual details give the “new” iPad mini away: it says “iPad mini” on the back, it’s called “iPad mini (A17 Pro)” on the box, and it’s even called “iPad mini (A17 Pro)” (and not “iPad mini (7th generation)”) in Settings ⇾ General ⇾ About.

I’m spending time on these minor, largely inconsequential details because I don’t know how else to put it: this iPad mini is pretty much the same iPad I already reviewed in 2021. The iPadOS experience is unchanged. You still cannot use Stage Manager on any iPad mini (not even when docked), and the classic Split View/Slide Over environment is passable, but more constrained than on an iPad Air or Pro. I covered all these aspects of the mini experience in 2021; everything still holds true today.

What matters today, however, is what’s inside. The iPad mini with A17 Pro is an iPad mini that supports Apple Intelligence, the Apple Pencil Pro, and faster Wi-Fi. And while the display technology is unchanged – it’s an IPS display that refreshes at 60 Hz – the so-called jelly scrolling issue has been fixed thanks to an optimized display controller.

As someone who lives in Italy and cannot access Apple Intelligence, I’m left with an iPad mini that is only marginally different from the previous one, with software features coming soon that I won’t be able to use for a while. I’m left with a device that comes in a blue color that isn’t nearly as fun as the one on my iPhone 16 Plus and that feels chunkier than my iPad Pro while offering fewer options in terms of accessories (no Magic Keyboard) and software modularity (no Stage Manager on an external display).

And yet, despite the strange nature of this beast and its shortcomings, I’ve found myself in a similar spot to three years ago: I don’t need this iPad mini in my life, but I want to use it under very specific circumstances.

Only this time, I’ve realized why.
