Posts in Linked

Macintosh Desktop Experience: No Mac Is an Island

One of the perks of a Club MacStories+ and Club Premier membership is the special columns published periodically by John and me. In this week’s Macintosh Desktop Experience column, John explained how widgets in macOS Sonoma are the glue between apps and services that makes the Mac feel even more like part of an integrated ecosystem of platforms and devices:

The Mac’s place in users’ computing lives has changed a lot since Steve Jobs returned to Apple and reimagined the Mac as a digital hub. Those days were marked by comparatively weak mobile phones, MP3 players, camcorders, and pocket digital cameras that benefitted from being paired with the Mac and Apple’s iLife suite.

The computing landscape is markedly different now. The constellation of gadgets surrounding the Mac in Jobs’ digital hub have all been replaced by the iPhone and iPad – powerful, portable computers in their own right. That’s been a seismic shift for the Mac. Today, the Mac is in a better place than it’s been in many years thanks to Apple silicon, but it’s no longer the center of attention. Instead, it sits alongside the iPhone and iPad as capable computing peers.

What hasn’t changed from the digital hub days is the critical role played by software. In 2001, iLife’s apps enabled the digital hub, but in 2023, the story is about widgets.

Stay until the end of the story and don’t miss the photo of John’s desk setup, which looks wild at first, but actually makes a lot of sense in the context of widgets.

Macintosh Desktop Experience is one of the many perks of a Club MacStories+ and Club Premier membership and a fantastic way to recognize the modern reality of macOS, as well as get the most out of your Mac thanks to John’s app recommendations, workflows, and more.

Join Club MacStories+:

Join Club Premier:

Permalink

Automation Academy: Leveraging Reminders to Make Saving Tasks to Things More Reliable On-the-Go

One of the perks of a Club MacStories+ and Club Premier membership is the special columns published periodically by Federico and John. In today’s Automation Academy, which debuted a refreshed format, Federico explains how he leveraged the tight integration between Reminders and Siri to improve the experience of saving tasks to Things by Cultured Code on the go.

As Federico explains:

One of the features I missed from Reminders was its deep integration with Siri and background sync privileges. Whether you’re using Siri on the iPhone or Apple Watch, you can quickly dictate a new task with natural language and rest assured you’ll find it a few seconds later on any other device signed into your iCloud account. For instance, I can’t tell you how many times I added a reminder (with dates and times) using Siri while driving via my Apple Watch and immediately found it on my iPad once I got home. You just don’t have to worry about sync if you’re using iCloud and Reminders, which is one of the most important advantages of the app.

Among other techniques, the post explains how to use ‘Repeat with Each’ blocks with magic variables, along with an always-on Mac running Lingon X (available for 20% off on the Club MacStories Discount page), to create a rock-solid way of adding new tasks from an Apple Watch or other device using Siri.

Automation Academy is one of the many perks of a Club MacStories+ and Club Premier membership and an excellent way to learn advanced Shortcuts techniques that are explained in the context of solutions to everyday problems.

Join Club MacStories+:

Join Club Premier:

Permalink

AppStories, Episode 355 – Building a Link Gathering Machine

This week on AppStories, I surprise Federico with a link gathering machine I designed for processing and reading the web.

Sponsored by:

  • CleanMyMac X – Your Mac. As good as new. Get 5% off today.

On AppStories+, Federico tests more AR glasses and reports back on his experience with the XREAL Air after a week, plus I join in on the fun.

We deliver AppStories+ to subscribers with bonus content, ad-free, and at a high bitrate early every week.

To learn more about the benefits included with an AppStories+ subscription, visit our Plans page, or read the AppStories+ FAQ.

Permalink

Obsidian’s Popularity Explained

It’s been nearly three years since I first started using Obsidian, and the app has come a long way since then: its core functionality has expanded, its vibrant plug-in developer community continues to go strong, and more and more users have been captivated by its flexibility. According to Jared Newman, writing for Fast Company:

Obsidian estimates that it has one million users, and its Discord channel has more than 110,000 members, who use the app for everything from task management and bookmarking to organizing their daily thoughts.

That’s remarkable growth for an app originally developed by just two people and with a team that still stands at under a dozen members.

Newman’s story, The cult of Obsidian: Why people are obsessed with the note-taking app, does an excellent job capturing what makes Obsidian special and even attracts fans of native apps like Federico and me:

John Voorhees, the managing editor at MacStories, started using Obsidian a couple of years ago after being drawn to its local file structure, and both he and MacStories founder Federico Viticci have written extensively about their Obsidian setups since then.

Obsidian is on [sic] some ways the opposite of a quintessential MacStories app—the site often spotlights apps that are tailored exclusively for Apple platforms, whereas Obsidian is built on a web-based technology called Electron—but Voorhees says it’s his favorite writing tool regardless. He and Viticci have even commissioned some bespoke plug-ins for their Macstories [sic] workflows.

“No matter what your writing needs are, there’s probably a plug-in to satisfy them,” he says.

There are a lot of other reasons I use Obsidian, including its use of local, plain text files formatted in Markdown, but it’s the plug-in system that has made it indispensable to my work. The app serves simultaneously as my text editor, note-taking app, and database, allowing me to move effortlessly among projects and tasks, thanks to the portability of plain text.

Permalink

The History of Cover Flow

A few months ago when I was writing about Widgetsmith’s new music widgets in my iOS 17 review, I told my buddy Stephen Hackett I couldn’t believe there was no Cover Flow retrospective on 512 Pixels. Yesterday, Stephen delivered:

Over the last decade or so, Apple has been hard at work in simplifying the user interfaces that power its myriad platforms. I’ve welcomed most of that work, but it’s hard to deny that we’ve all lost some things along the way.

Today, we look at a UI element that started life in iTunes, but spread to the iPod, iPhone and Mac over time: Cover Flow.

I had completely forgotten that Cover Flow eventually found its way to Safari as well. I miss Cover Flow more today than I ever used it at the time; I wonder if a similar 3D interface could be revived for the age of visionOS and Vision Pro.

Permalink

AppStories, Episode 354 – Apple Vision Pro and Apps

This week on AppStories, we spend time with the visionOS simulator and consider the design of the Apple Vision Pro system apps and what to expect from third-party developers.

Sponsored by:

  • Zocdoc – Find the right doctor, right now with Zocdoc. Sign up for free.
  • CleanMyMac X – Your Mac. As good as new. Get 5% off today.
  • Notion – Do your most efficient work with Notion AI. Try it free today.

On AppStories+, Federico dropped a big AR surprise on me.

We deliver AppStories+ to subscribers with bonus content, ad-free, and at a high bitrate early every week.

To learn more about the benefits included with an AppStories+ subscription, visit our Plans page, or read the AppStories+ FAQ.

Permalink

AppStories, Episode 353 – tvOS 17: The MacStories Review with Sigmund Judge

This week on AppStories, we are joined by tvOS expert Sigmund Judge, who just finished writing his tvOS 17 review for MacStories, to understand what has changed in tvOS and where it might be heading.

Sponsored by:

  • TV Forecast – Track, Explore and Discover Your Favorite Shows and Movies

On AppStories+, I explain iPads on a Plane, a far safer version of Snakes on a Plane.

We deliver AppStories+ to subscribers with bonus content, ad-free, and at a high bitrate early every week.

To learn more about the benefits included with an AppStories+ subscription, visit our Plans page, or read the AppStories+ FAQ.

Permalink

Apple’s Revised AirPods Pro 2 and Lossless Audio Support on Vision Pro

Soon after Apple’s Wonderlust event, it became clear that the company’s revised AirPods Pro with a USB-C case offered more than an updated connector. As detailed in a press release, the upgraded version of the second-generation AirPods Pro “unlocks powerful 20-bit, 48 kHz Lossless Audio with a massive reduction in audio latency”. But how?

Here’s Joe Rossignol, reporting at MacRumors:

In a video interview with Brian Tong, Apple’s VP of Sensing and Connectivity Ron Huang explained why only the updated second-generation AirPods Pro with a USB-C charging case support lossless audio with Apple’s upcoming Vision Pro headset.

Huang revealed that the H2 chip in the USB-C AirPods Pro supports the 5GHz band of wireless frequencies for ultra-low latency and less interference, while the H2 chip in the original second-generation AirPods Pro with a Lightning case is limited to the 2.4GHz band. Apple says it is this 5GHz support that enables the updated AirPods Pro to support lossless audio with the Vision Pro, which is slated for release in the U.S. in early 2024.

The addition of 5GHz wireless makes complete sense in hindsight, and it doesn’t surprise me that Apple prioritized sound quality and latency reduction for a platform where full immersion is key to the experience.

Beyond Vision Pro, however, I wonder whether we’ll ever have any updates on the lossless audio front regarding Apple Music and AirPods Pro.

We know that Apple Music’s lossless catalog supports resolutions “ranging from 16-bit/44.1 kHz (CD Quality) up to 24-bit/192 kHz”. The new AirPods Pro fall short of supporting hi-res lossless playback at 24-bit/192 kHz, but so-called CD Quality lossless playback should now be within the capabilities of the device. Last time Apple gave a statement on the lack of lossless playback in AirPods Pro, they mentioned there are “other elements” to improve sound quality that aren’t necessarily about Bluetooth codecs. Is Apple waiting until they can support full 24-bit/192 kHz playback in future AirPods Pro hardware, or are there more audio-related changes coming with the launch of Vision Pro?
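To put those bit depths and sample rates in perspective, the raw PCM data rate of each format is simple arithmetic (a quick sketch; these are uncompressed figures, not a claim about what Apple actually transmits over the AirPods Pro wireless link, which uses its own codec):

```python
# Uncompressed PCM data rate = bit depth x sample rate x channels.

def pcm_kbps(bit_depth, sample_rate_hz, channels=2):
    """Raw stereo PCM bitrate in kilobits per second."""
    return bit_depth * sample_rate_hz * channels / 1000

formats = {
    "CD Quality (16-bit/44.1 kHz)": pcm_kbps(16, 44_100),   # 1,411.2 kbps
    "Vision Pro link (20-bit/48 kHz)": pcm_kbps(20, 48_000), # 1,920.0 kbps
    "Hi-Res Lossless (24-bit/192 kHz)": pcm_kbps(24, 192_000), # 9,216.0 kbps
}

for name, kbps in formats.items():
    print(f"{name}: {kbps:,.1f} kbps")
```

As the numbers show, the 20-bit/48 kHz link Apple describes comfortably exceeds CD Quality, while hi-res lossless at 24-bit/192 kHz demands several times more bandwidth, which may be part of why it remains out of reach for current AirPods hardware.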

Permalink

iPhone 15, USB-C, and External Displays

Apple published an extensive support document about the USB-C connector on the new iPhone 15 lineup (we should be receiving our new iPhones later this week at MacStories, so stay tuned for our coverage), and a few details about compatibility with external displays caught my attention.

For starters, yes – Apple implemented DisplayPort connections over USB-C just like on the iPad Pro. The iPhone, however, is limited to a lower resolution:

iPhone uses the DisplayPort protocol to support connections to USB-C displays at up to 4K resolution and 60Hz.

Note that the latest iPad Pros support connections up to 6K, allowing you to connect an iPad Pro to a Pro Display XDR if you hate your wallet. You can try this with an iPhone 15 too, but display resolution is going to be limited to 4K. The Studio Display will be supported too, obviously.

Another tidbit from Apple’s support document:

You can connect your iPhone to an HDMI display or TV with a USB-C to HDMI adapter or cable. Adapters and cables that support HDMI 2.0 can output video from your iPhone at 4K resolution and 60Hz.

The Apple USB-C Digital AV Multiport Adapter is compatible with iPhone. This adapter can output video from iPhone at up to 4K resolution and 60Hz, including content in HDR10 or Dolby Vision if your display or TV supports HDR.

If my theory is correct, we should soon be able to connect an iPhone to an HDMI capture card (such as the ones I covered in my iPadOS 17 review) via Apple’s adapter and an HDMI cable, connect the capture card to an iPad, and use a compatible app to see the iPhone’s display on your iPad. That could be used for screencasts, playing videos from an iPhone on the iPad’s display, or, better yet, playing a videogame from the iPhone in a Stage Manager window on the iPad.

The iPhone itself doesn’t support Stage Manager, so, unlike Samsung phones, it can’t be turned into a desktop workstation when plugged into an external monitor (I hope this happens down the road though). However, I do believe we’re going to start seeing some interesting experiments with iPhones being used as handheld gaming consoles with external monitors. Whether you’ll be using a capture card to turn an iPad into an external monitor for an iPhone using apps like Orion1 or Genki Studio2 or connecting the phone to a portable OLED display, I think this newfound hardware modularity is going to be fascinating to observe.


  1. I tested the new app by the makers of Halide today shortly before it came out, and while I found its onboarding and UI delightful and the app worked well at standard resolutions, its built-in upscaling mode didn’t work for me. I tried displaying Nintendo Switch games on my iPad Pro using Orion and 4K upscaling, but the feature made games unplayable due to 3-4 seconds of added latency. I hope the Orion developers can work on a fix for this since software-based upscaling that doesn’t require a separate dongle could be a fantastic reason to use an iPad as a monitor. ↩︎
  2. This is the app that I covered as Capture Pro in my iPadOS 17 review. As it turns out, the developer teamed up with the folks at Genki (makers of the excellent Covert Dock Mini that I use with my Switch) and released the app under the name Genki Studio on the App Store this week. The functionality of the app is unchanged, and I still recommend it. ↩︎

Permalink