This Week's Sponsor:

Winterfest 2024

The Festival of Artisanal Software


Posts in Linked

Apple Opens the 2024 Swift Student Challenge to Submissions

Source: Apple.

Apple has opened up submissions for the 2024 Swift Student Challenge. As we reported last November, this year’s format is a little different than in years past. Eligible students have from today through February 25, 2024, to submit their app playgrounds on a topic of their choosing.

Apple will choose 350 winners from the entries submitted and name 50 of them Distinguished Winners, who will be invited to visit Apple’s Cupertino headquarters. All winners will receive a one-year membership in the Apple Developer Program, a voucher to take an App Development with Swift certification exam, and a special gift.

Two of my kids participated in the Swift Student Challenge in the past. If you know a student who’s interested in learning to code, I know from experience that this is a great way to get them started and excited about the subject.

Permalink

iFixit Disassembles Apple Vision Pro

Source: [iFixit](https://www.ifixit.com/News/90137/vision-pro-teardown-why-those-fake-eyes-look-so-weird).

As with myriad other Apple gadgets, the folks at iFixit have pulled apart a Vision Pro to see what makes it work. There’s a lot of tech crammed into a relatively small space, which made the Vision Pro difficult to take apart. But after heating and prying parts apart and removing brackets, screws, and cables, iFixit reached the inner layers to show off all of the Vision Pro’s components, concluding that:

The Vision Pro is insanely ambitious. Yes, it’s heavy, and the glass is fragile, and that tethered battery might get annoying. But Apple has managed to pack the power of a Mac, plus the performance of a new dedicated AR chip, into a computer that you can wear on your face.

Repairability-wise, it’s not great, but on the plus side, some of the connections are quite delightful. You should have seen our teardown team jump up when they realized that the side arms could be popped out using the SIM-removal tool, for example, and the magnetic cushions are yet more user-friendly.

To see the Vision Pro pulled apart step by step, don’t miss the companion video on YouTube.

Permalink

Wallpaper Interviews Apple’s Alan Dye and Richard Howarth

Today, Wallpaper published an interview with Alan Dye, Apple’s Vice President of Human Interface Design, and Richard Howarth, Vice President of Industrial Design. It’s a fantastic read with some great images, including an exploded view of the Vision Pro’s components.

Something I noticed as soon as I unboxed the Apple Vision Pro was how approachable it was. The setup process was easy, well-paced, and felt natural. That carries through to the hardware itself, too, which Dye explained to Wallpaper:

We wanted people around you to also feel comfortable with you wearing it, and for you to feel comfortable wearing it around other people. That’s why we spent years designing a set of very natural, comfortable gestures that you can use without waving your hands in the air. That’s also why we developed EyeSight, because we knew more than anything, if we were going to cover your eyes, that takes away much of what is possible when you connect with people. Getting that right was at the core of the concept of the product because we wanted people to retain those connections in their actual world.

My very early impression is that Apple’s design team accomplished its goal. Howarth puts a slightly different spin on the same message:

There’s a hardness and precision to the front of the product that is completely technical and feels like it’s been sent from the future, but then everything else that connects the product to you is soft and really approachable, so you feel cushioned and there’s not a barrier to putting it on or taking it off. And in fact, it should be a pleasure.

Nobody is going to confuse the Vision Pro for something that it’s not. Still, the care that has been taken in its design goes a long way toward making a device that is completely foreign to many people feel far less intimidating. That’s something uniquely Apple, and it’s why I’m optimistic about Vision Pro’s long-term prospects.

Permalink

On Vision Pro’s Spatial Computing

There’s no tech commentator better equipped to talk about the history of spatial interfaces in Apple operating systems than John Siracusa, and I enjoyed his latest, thought-provoking column on where visionOS and the Vision Pro’s gesture system fit in the spatial computing world:

Where Vision Pro may stumble is in its interface to the deep, spatial world it provides. We all know how to reach out and “directly manipulate” objects in the real world, but that’s not what Vision Pro asks us to do. Instead, Vision Pro requires us to first look at the thing we want to manipulate, and then perform an “indirect” gesture with our hands to operate on it.

Is this look-then-gesture interaction any different than using a mouse to “indirectly” manipulate a pointer? Does it leverage our innate spatial abilities to the same extent? Time will tell. But I feel comfortable saying that, in some ways, this kind of Vision Pro interaction is less “direct” than the iPhone’s touch interface, where we see a thing on a screen and then literally place our fingers on it. Will there be any interaction on the Vision Pro that’s as intuitive, efficient, and satisfying as flick-scrolling on an iPhone screen? It’s a high bar to clear, that’s for sure.

In yesterday’s review on The Verge, Nilay Patel shared a similar idea: it’s a strange feeling to use a computer that requires you to look at what you want to control at all times. I don’t know what to think about this yet since I don’t have a Vision Pro, but I’m curious to see how this interaction method scales over time as we start using the new platform on a daily basis. It’s quite fitting, however, that visionOS is based on the one Apple platform that supports both kinds of manipulation: pointer and touch.

Permalink

AppStories, Episode 368 – Workflow Experiments

This week on AppStories, we explore workflows we’ve been trying in a variety of apps.

Sponsored by:

  • Memberful - Help your clients monetize their passion. Get started for free today!

On AppStories+, Federico and John follow up on last week’s Vision Pro episode, revisiting their plans and considering what the device will be good for besides sitting by yourself watching movies.

We deliver AppStories+ to subscribers early every week, ad-free, at a high bitrate, and with bonus content.

To learn more about the benefits included with an AppStories+ subscription, visit our Plans page, or read the AppStories+ FAQ.

Read more


QuickTime VR and Spatial Computing

Source: Tabletops.

Soon, the world will get a glimpse of the Apple Vision Pro outside of the tightly controlled demos provided to a select number of people. As we wait for that moment, it’s worth taking a look back at QuickTime VR with Michael Steeber through his excellent newsletter, Tabletops.

QuickTime VR was an interactive, panoramic image format that explored some of the spatial concepts now coming to the Vision Pro, albeit delivered on CD-ROMs for 90s-era Macs. To show off the technology, Apple created a demo CD that included a virtual tour of the recently closed Company Store. Steeber got the tour up and running and shared some wonderful images and videos of Apple’s vision for VR 30 years ago.

The story is full of interesting details about Apple retail when the Company Store was all there was to Apple retail:

At the entrance to the store is a physical map of the space, like the kind you’d find at a trailhead or in a museum lobby. In the Performa department, a cutout of a child hanging upside down looms from the ceiling. Along the wall is a disheveled pile of AppleDesign Speaker boxes. In the Newton department, an entire wall is wrapped with a print of someone’s backside, toting a Newton in their jeans pocket.

One section of the store is filled with more than 700 software titles. In early promotional materials, Apple called this aisle “Technology Way,” which is so similar to the “Software Alley” in early Apple Stores that I can’t help but wonder if it was carried over.

Be sure to check out the latest issue of Tabletops to see QuickTime VR in all its glory and sign up for the newsletter while you’re there. It’s always a good read.

Permalink

Add Timestamp Links to Apple Podcasts Next

Matthew Cassinelli:

Yesterday, Apple began adding transcripts to Apple Podcasts, detailing the change on the Apple Podcasts for Creators site and making them available in iOS 17.4 developer beta 1.

This change is a huge win for accessibility, will surely improve searching in the Podcasts app, and makes quoting your favorite podcast an easy task by letting you copy and paste the text out – something I’ll definitely have to turn into a shortcut soon.

All these benefits are great in their own way and will make podcasts more shareable as a whole, allowing us to unlock so many people’s great ideas that are currently stored within hours of audio files and obscured behind URLs that point only to the show or episode as a whole.

However, I think Apple needs to go one step further and add timestamp links to Apple Podcasts, a long-overdue feature that’d enable users to share links to individual moments within a podcast, directly to a specific point in the transcript.

I couldn’t agree more. From sharing to personal note-taking and research, there are several use cases I can think of that would take advantage of timestamp links for podcast episodes – especially now that episodes have transcripts. (Pocket Casts, my favorite third-party podcast player, goes even further: it lets you share timestamp links and save private, time-synced bookmarks for specific parts of any episode.)

I like Matthew’s suggestions for how Apple could implement this feature, and I’ll add that Apple has already built this system for the Music app. When the company added shareable lyrics to Music in iOS 14.5, it did so with the ability to share selected lyrics as a special “snippet” in iMessage that is actually an interactive, timestamped song preview based on a special URL. Here’s what I wrote:

Besides Apple’s custom implementation of lyrics selection in the share sheet, what’s also interesting about this is the method the company is using to share Apple Music lyrics URLs. Unlike regular music.apple.com links that reopen a particular song or album in the Music app or play a generic preview snippet in iMessage, lyrics URLs are timestamped: in iMessage, the lyrics card has a play button that will preview the lyrics you shared inline within a conversation; if you tap the link in iMessage and the same song is already paused in the Music app, the Now Playing screen will automatically advance to the section highlighted in shared lyrics.

I’m assuming that Apple is aware that this feature is missing from the Podcasts app in iOS 17.4 beta 1; I have to believe their future implementation will be very similar to what already exists in Music.
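
To make the idea concrete, here’s a minimal sketch of what parsing such a timestamped link could look like. The URL shape and the t= parameter (an offset in seconds) are my own illustration, not a documented Apple Podcasts format:

```swift
import Foundation

// A minimal sketch: extract a playback offset from a hypothetical
// timestamped podcast link such as
// https://podcasts.apple.com/podcast/id12345?i=67890&t=754
// The "t" parameter (seconds into the episode) is an illustration,
// not a documented Apple Podcasts URL format.
func playbackOffset(from url: URL) -> TimeInterval? {
    guard let components = URLComponents(url: url, resolvingAgainstBaseURL: false),
          let value = components.queryItems?.first(where: { $0.name == "t" })?.value,
          let seconds = TimeInterval(value), seconds >= 0 else {
        return nil
    }
    return seconds
}

// Example: a link pointing 12 minutes and 34 seconds into an episode.
if let link = URL(string: "https://podcasts.apple.com/podcast/id12345?i=67890&t=754"),
   let offset = playbackOffset(from: link) {
    print("Seek playback to \(offset) seconds and scroll the transcript to match")
}
```

On the receiving end, the Podcasts app would only need to seek the player and scroll the transcript to the matching segment, much like Music advances the Now Playing screen to the shared lyrics.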

Permalink

Obsidian’s ‘2023 Gems of the Year’

Silver, writing on the Obsidian blog:

It has been nearly four years since the first line of code of Obsidian was written on January 31st, 2020. Today we’re thrilled to announce the winners of our fourth annual Gems of the Year awards!

This year the Obsidian community nominated 287 projects, including plugins, themes, tools, content, and templates. After our panel narrowed down the selection and the community voted on the entries, we’re now excited to announce the winners.

Solid list of plugins and themes for the best note-taking app out there, many of which I wasn’t familiar with or hadn’t tested yet. The Border theme looks stunning and I’m going to give it a try as my primary theme in the app; the Home Tab plugin does exactly what I want from a new empty tab in Obsidian (a search box + recently opened files); Omnivore, which I’m testing as my read-later app after they added better Shortcuts integration, does a fantastic job syncing highlights to Obsidian with its plugin. Go check out this list of gems if you haven’t yet.

Side note: I’m really curious to see how the Obsidian team prioritizes updates to its iPhone and iPad apps (by far, the weakest spot of the app) in 2024.

Permalink

Apple Shares the Secret of Why the 40-Year-Old Mac Still Rules

Steven Levy, writing for Wired, interviewed Apple executives about the secret to the Mac’s 40-year run:

“With the transition to Apple silicon that we started in 2020, the experience of using a Mac was unlike anything before that,” says John Ternus, Apple’s senior vice president of hardware engineering.

Ternus’ comment opens up an unexpected theme to our conversation: how the connections between the Mac and Apple’s other breakout products have continually revitalized the company’s PC workhorse. As a result, the Mac has stayed relevant and influential way past the normal lifespan of a computer product.

In the past few years, Mac innovations sprang from the transition to custom Apple silicon chips first pioneered to power iPhones. “I joke that we had to create the iPhone to create the scale to build the Mac we wanted to build,” says Craig Federighi, Apple’s senior vice president of software engineering. Ternus also notes that the iPhone’s contribution to Apple’s bottom line has been very good to the Mac. “As the business has been successful, it’s enabled us to invest and do the things we always wanted to do,” he says.

One example of that, I mention, must have been the recent boost to battery life in Mac notebooks. “When we broke physics?” jokes Joswiak. Indeed, the almost daylong span, 22 hours of battery life in some MacBook Pros, can feel life-changing. Again, this was a collateral effect of efforts to extend battery life in the iPhone.

“When we first started working with Apple silicon, it honestly did feel for us like the laws of physics had changed,” says Ternus. “All of a sudden, we could build a MacBook Air with no fan with 18 hours of battery life,” he says. “The best arrow in our quiver is efficiency. Because if you can improve efficiency, everything gets better.”

Levy has been covering the Mac from the beginning. His article is a fascinating look back at important moments in the computer’s history and at where it stands today.

Apple silicon is just the latest inflection point for a computer that has seen more than its fair share of changes over four decades. For a while, it looked like the Mac would be relegated to history’s dustbin – left behind by the iPhone. But it’s the very success of the iPhone that formed the foundation of some of the greatest strengths of today’s Mac. It’s an age-old story: success reclaimed through the reinvention needed to avert irrelevance.

Permalink