I have low vision. A kind you can’t really correct for with glasses or contacts. I also bought Apple Vision Pro at launch. Why would I do this? Well, because I’m a nerd who wants to see the future, but also because I was fascinated to see how Apple would handle accessibility for this new product. Apple’s track record on accessibility in the past decade has been stellar, in my opinion, with their teams adding powerful options every year and ensuring every new platform has accessibility support built in from the start.
After watching Apple’s WWDC23 session on visionOS accessibility, I knew accessibility was an important point for them on this platform. But even after consuming as much information about it as I could, I knew I had to try it for myself to answer the important question: how well does it work for me?
Terrific overview of the Accessibility features of visionOS and Vision Pro by Zach Knox.
It’s no surprise to learn that Apple’s Accessibility team did some amazing work for this new platform too, but it’s impressive to see that on day one of the Vision Pro there are already dozens of Accessibility features and accommodations in place. And keep in mind that these are Accessibility options that work with Apple apps and third-party ones, right out of the box. This is the kind of ecosystem advantage and platform integration that newfound tech reviewer Zuckerberg probably forgot to mention in his video.
See also: Tom Moore’s story on trying the Vision Pro with one eye only, Peter Saathoff-Harshfield’s Mastodon thread, Shelly Brisbin’s story for Six Colors, and Ryan Hudson-Peralta’s fantastic overview (via 9to5Mac) of using the Vision Pro without hands, which I’m embedding below.