Posts tagged with "visionOS"

The Best Way to Take Screenshots on Apple Vision Pro

Taking good-looking screenshots on the Apple Vision Pro isn’t easy, but it’s not impossible either. I’ve already spent many hours taking screenshots on the device, and I thought I’d share my experience and some practical tips for getting the best screenshots possible.

Although I’ve only had the Apple Vision Pro for a week, I’ve already spent a lot of time thinking about and refining my screenshot workflow out of necessity. That’s because after I spent around three hours writing my first visionOS app review of CARROT Weather and Mercury Weather, I spent at least as much time trying to get the screenshots I wanted. If that had been a review of the iOS versions of those apps, the same number of screenshots would have taken less than a half hour. That’s a problem because I simply don’t have that much time to devote to screenshots.

Taking screenshots with the Apple Vision Pro is difficult because of the way the device works. Like other headsets, the Apple Vision Pro uses foveated rendering, a technique that reduces the computing power needed to render the headset's images. In practical terms, it means that the only part of the device's view that's in focus is wherever you're looking. The focal point changes as your eyes move, so you don't notice that the rest of the view is blurry. That's how human vision works, too: only the center of your gaze is sharp. So as long as the eye tracking is good, which it is on the Apple Vision Pro, the experience is good too.
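To make the idea concrete, here's a minimal sketch, in plain Python rather than anything visionOS-specific, of how a foveated renderer might assign shading resolution based on a pixel's distance from the gaze point. The function name, radii, and falloff values are all illustrative, not Apple's actual implementation:

```python
import math

def foveation_level(pixel, gaze, inner_radius=0.1, outer_radius=0.3):
    """Relative shading rate for a pixel (1.0 = full resolution), based on
    its distance from the gaze point. Coordinates are normalized to [0, 1];
    the radii and falloff values are illustrative, not Apple's."""
    d = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1])
    if d <= inner_radius:
        return 1.0    # fovea: full resolution where you're looking
    if d >= outer_radius:
        return 0.25   # periphery: quarter resolution, rarely noticed
    # Linear falloff between the fovea and the periphery
    t = (d - inner_radius) / (outer_radius - inner_radius)
    return 1.0 - 0.75 * t
```

A screenshot freezes this resolution map in place, which is exactly why everything away from your gaze point comes out blurry.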

However, as well as foveated rendering works for using the Apple Vision Pro, it’s terrible for screenshots. You can take a quick screenshot by pressing the top button and Digital Crown, but you’ll immediately see that everything except where you were looking when you took the screen-grab is out of focus. That’s fine for sharing a quick image with a friend, but if you want something suitable for publishing, it’s not a good option.

Fortunately, Apple thought of this, and there’s a solution, but it involves using Xcode and another developer tool. Of course, using Xcode to take screenshots is a little like using Logic Pro to record voice memos, except there are plenty of simple apps for recording voice memos, whereas Xcode is currently your only choice for taking crisp screenshots on the Vision Pro. So until there’s another option, it pays to learn your way around these developer tools to get the highest quality screenshots as efficiently as possible.
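As an aside for developers: if you only need a screenshot of an app running in the visionOS simulator rather than on the headset itself, Xcode's bundled simctl tool can capture a crisp one from the command line. This assumes a Mac with Xcode installed and a booted visionOS simulator; the output filename is whatever you choose:

```shell
# Capture a screenshot of the currently booted simulator.
# "booted" targets the running simulator instance.
xcrun simctl io booted screenshot vision-pro-app.png
```
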


The Apple Vision Pro Developer Strap

Jeff Benjamin, writing for 9to5Mac, has a comprehensive breakdown of what the Apple Vision Pro Developer Strap can and can't do. One of its primary benefits for developers is capturing video. As Benjamin writes:

The Developer Strap also lets developers capture a direct video feed from Apple Vision Pro via a wired USB-C connection using Reality Composer Pro. File transfers of the captured feed occur via the direct USB-C connection. Users without the strap can still capture these feeds but via Wi-Fi only.

Benjamin also explains how to use the strap to access Recovery Mode:

You can also restore visionOS using Recovery Mode via the wired connection made possible by the Developer Strap. This includes downgrading from visionOS beta releases.

My experience is in line with Benjamin’s. The Developer Strap may make capturing short videos and screenshots easier, but it can’t do much else.

I will add, however, that a MacStories reader tipped me off to one other thing the Developer Strap can do: act as a video source for QuickTime. This works a lot like capturing screenshots and video from an Apple TV via QuickTime, and the advantage is that you can record for longer than the 60-second cap imposed by Reality Composer Pro. That's great, except that the capture is foveated, meaning the video will be blurry everywhere except where you were looking.


Vision Pro App Spotlight: NowPlaying

NowPlaying by Hidde van der Ploeg has come a long way since its start. It's always been an excellent companion to Apple Music, packed with music discovery features that fill a big gap in Apple's system app. But with the visionOS version, van der Ploeg has taken NowPlaying to a new level. visionOS lets users of the app spread out, focus on the music, and absorb the rich catalog of metadata and editorial content about their music in a beautiful, relaxing atmosphere.


Vision Pro App Spotlight: Day Ahead

Day Ahead takes an interesting approach to visualizing the events on your calendar. It's a visionOS-only app that displays what looks like a transparent tube filled with drops of colored liquid representing the events of your day. It's strange, but I think there's something to it, and I expect we'll see more experiments like it as visionOS developers explore the unique characteristics of the Apple Vision Pro.


Vision Pro App Spotlight: Juno

Leading up to the release of Apple Vision Pro, there was as much, or perhaps more, talk about which apps wouldn't be on the platform on day one as there was about which would. To be sure, there are some very notable holes in the Vision Pro's catalog, and one of the biggest is YouTube. However, as we'll see over and over with the Vision Pro apps we'll be covering at MacStories, the gap isn't nearly as bad as you'd think, thanks to developers like Christian Selig, the former maker of the Reddit client Apollo, and his brand new app, Juno.


Vision Pro App Spotlight: CARROT Weather and Mercury Weather

We’re going to be covering a lot of visionOS apps over the coming weeks, so I thought a fitting place to start would be with two of our favorite weather apps from other Apple platforms: CARROT Weather and Mercury Weather. Both apps are past MacStories Selects award winners. CARROT Weather won the Best Watch App award in 2020 and the Readers’ Choice award in 2022, and we named Mercury Weather the Best Design winner of 2023. So, I expect a lot of readers are already familiar with both apps. However, if you’re not, be sure to check out our past coverage of what makes them two of our favorite weather apps on the iPhone, iPad, Mac, and Apple Watch.

So today, my focus is solely on the visionOS versions of both apps, which fill the gap left by Apple’s curious omission of its own Weather app from Vision Pro.


Apple Publishes Vision Pro User Guide and Support Documentation

Apple late last week published a comprehensive Vision Pro user guide and accompanying support documentation. I’ve spent some time browsing the user guide, and it’s full of excellent tips to help get people started with the new device. Each section of the guide links to related support documents, which go into more depth about the topics covered. I’ve pinned the page in Safari as I continue to explore everything the Apple Vision Pro can do.

Another page worth bookmarking is a story by Joe Rossignol and Aaron Perris of MacRumors, who compiled a long list of what they describe as ‘nearly all’ of the support documents listed in the Vision Pro user guide. You’ll come across links to these documents in the user guide itself, but if you want to go deeper on a topic and bypass the user guide, MacRumors’ story is a great place to start.


On Vision Pro’s Spatial Computing

There’s no tech commentator better equipped to talk about the history of spatial interfaces in Apple operating systems than John Siracusa, and I enjoyed his latest, thought-provoking column on where visionOS and the Vision Pro’s gesture system fit in the spatial computing world:

Where Vision Pro may stumble is in its interface to the deep, spatial world it provides. We all know how to reach out and “directly manipulate” objects in the real world, but that’s not what Vision Pro asks us to do. Instead, Vision Pro requires us to first look at the thing we want to manipulate, and then perform an “indirect” gesture with our hands to operate on it.

Is this look-then-gesture interaction any different than using a mouse to “indirectly” manipulate a pointer? Does it leverage our innate spatial abilities to the same extent? Time will tell. But I feel comfortable saying that, in some ways, this kind of Vision Pro interaction is less “direct” than the iPhone’s touch interface, where we see a thing on a screen and then literally place our fingers on it. Will there be any interaction on the Vision Pro that’s as intuitive, efficient, and satisfying as flick-scrolling on an iPhone screen? It’s a high bar to clear, that’s for sure.

In yesterday’s review for The Verge, Nilay Patel shared a similar idea: it’s a strange feeling to use a computer that requires you to look at what you want to control at all times. I don’t know what to think about this yet since I don’t have a Vision Pro, but I’m curious to learn how this interaction method will scale over time as we start using this new platform on a daily basis. It’s quite fitting, however, that visionOS is based on the one Apple platform that supports both kinds of manipulation: pointer and touch.


What Reviewers Have Learned about Apple Vision Pro

By the time most Apple hardware is released, we usually know every minute detail of the specs and have a pretty good idea of what using it will be like. That hasn’t been the case with the Apple Vision Pro. Apple has conducted multiple waves of demos since WWDC 2023, but those were tightly controlled and limited. Today, however, we’re seeing the first hardware reviews from a range of media outlets and YouTubers who have had a chance to spend about a week testing the device.

There are some excellent reviews that are well worth reading in full, but I thought I’d highlight some of the most interesting tidbits that were either unknown or unclear before now to help give readers a better sense of what this hardware is all about.
