What Is the Apple Vision Pro? A Lifesaver for Disabled Users


An illustration by Ari Liloan of the Venus de Milo wearing an Apple Vision Pro, looking at her digitally restored arms and an apple floating above her hand. When the statue was discovered on the Greek island of Melos, in 1820, a hand holding an apple was found nearby.

In her childhood bedroom, Maxine Collard had a PC connected to a cathode-ray tube monitor so massive it bowed her desk into a smile that grew deeper every year. Collard has oculocutaneous albinism, which means that her hair is naturally bleach white, her complexion maximally fair, and she has uncorrectably low visual acuity with limited depth perception. In order to see the screen, she had to crane her neck until her face was two inches from the monitor.

When Collard was in middle school, her mother bought an iMac for the family. Collard spent hours messing around on the new machine, her nose pressed almost to the glass. One day, deep in the computer’s accessibility settings, she discovered that if she held down the control key while spinning the mouse’s scroll wheel, she could instantaneously zoom the entire screen to whatever magnification level she wanted. There was a rudimentary magnifier app on her Windows computer, but she found the interface difficult to use, and the low-res image on the zoomed-in PC screen, she said, was pixelated, hard to read, “disgusting.” Her experience on the iMac, which allowed her to magnify the entire screen into a much clearer image, came as a revelation.

Earlier this year, Collard had a similar aha moment when she tried the Apple Vision Pro for the first time. Some critics of the AVP were skeptical of a device that pressed two high-resolution micro-OLED screens within millimeters of one’s eyes for hours at a time. But to Collard, the ability to (as she put it) “strap an iPad to my face” was instantly appealing.

Collard is now in her sixth year of a combined M.D.-Ph.D. program in neuroscience at UC San Francisco. When I visited her at her lab in late May, she showed me her workspace: a standing cubicle in a small hive of carrels she shared with her colleagues, a pair of 27-inch monitors on her desk. Zooming her entire screen has its liabilities in a social setting like this: One day she was reading her DMs on Slack, magnified so much that the words were two inches tall. A co-worker sent her a spicy message, something she would have preferred to keep private, or at least in 11-point type, but instead it was broadcast for all her colleagues to see.

After she got an AVP, she had unprecedented control over her visual environment. She took her lab’s Slack channels, enlarged them to the size of a refrigerator, and set them off to her right. Then she opened her code editor and set it in front of her — inches from her eyes, as usual, but five times the size of her external monitors, her posture ramrod straight, no more craning. Finally, she opened a browser window, stretched it to the size of a door frame, loaded the documentation for a tricky data-analysis function she could never remember, and set it off to her left.

Collard has strabismus — her eyes don’t align the way typical eyes do — which would confuse most eye-tracking algorithms. But in the AVP’s accessibility menu, she turned on “single-eye tracking,” which tells the device to follow just one of her eyes. The device can also lessen the effects of her nystagmus — involuntary eye “wiggles” that have confounded eye-tracking devices she’s used in the past.

The AVP has a range of accessibility features for other disabilities as well. Blind users can turn on VoiceOver, a screen reader that speaks on-screen text aloud and is navigated with a custom set of hand gestures. People with mobility disabilities can make selections through a variety of alternative methods: with their voice; with a switch or joystick (easier for some users with motor disabilities); or with a feature called Dwell Control, which makes a selection when a user simply “dwells” their gaze on an item. With Sound Actions, a user can make a selection with a custom noise (like a cluck or a pop). In lieu of eye gaze, the pointer can be controlled with one’s head, wrist, or finger. And most of the accessibility features users are familiar with from other Apple products — reduced motion, color filters for color blindness, hearing-device support — are included.

Because of her reliance on large monitors, Collard could never comfortably join her colleagues to debug code in a coffee shop or in the shared kitchen one level down from their sixth-floor lab. That’s all changed with the AVP. “As a disabled person,” she wrote in a blog post, “the ability to finally sit back with my feet up on a bench out in the sun while working on my laptop — or more accurately, while working on a 30-foot-wide 4K screen floating in exactly the perfect ergonomic position, one that I can reposition anywhere I want it to be in any moment — is the answer to decades of prayers to the accessibility gods.”

Mission Bay was warm and breezy in late May, and Collard led me down to Koret Quad, where she now loved to sit and work. Inside her headset, a code editor the size of a garden shed floated above the grass. As she worked, she saw the window begin to shimmer as a shadowy figure troubled the lines of code. Then a man, smiling and looking right at her, strode through the window of her workspace and stopped. This sort of thing happens to Collard whenever she takes her AVP out in public — she has caught numerous people taking surreptitious selfies with her in the frame.

“Hi there,” she said preemptively to the smiling man, who was clearly drawn by the novelty of seeing an Apple Vision Pro en plein air. He looked at Collard and said, in a lightly mocking tone, “How’s that working out for you?”

Collard has struggled with her identity as a disabled person, resisting alien-seeming assistive tools like the monocular lenses that low-vision specialists tried to get her to use in school. But she sees the AVP as a liberatory device, and no arch tech skeptic on the quad could dampen that feeling. She fixed him with the gaze of her digital EyeSight eyes and answered with emphatic cheer: “Really great, in fact!”


Maxine Collard at her home in San Francisco.
Photo: Courtesy of Maxine Collard

The initial response to the Apple Vision Pro has been mixed. There are widespread complaints about the headset’s weight, its battery life, and its price — $3,500 for the lowest-end model. Sales have reportedly been sluggish. Kevin Roose, a technology columnist for the New York Times, recently wrote that he “couldn’t really figure out what it was for.” For many disabled users, however, the answer is clear: The Vision Pro is made for them.

About ten years ago, Steve Coulson, a creative director in New York, began losing his hearing, and today he has profound hearing loss. This made his work — constant in-person meetings, often in noisy environments — increasingly difficult. Like many people with disabilities, Coulson found the pandemic isolating, but he embraced certain elements of remote access: He had more control over the audio in virtual meetings, not to mention real-time captions and auto-generated transcripts.

Still, he missed the easy, dynamic exchanges of his pre-pandemic brainstorms, and his hearing loss made it difficult to reproduce that feeling in person, even with hearing aids. Now, meeting with his business partner in Spatial FaceTime on Vision Pro, he says, the feeling he’d lost has been restored. “It feels like I’m in a room again,” he said. “We can just sit together in a meeting, and I can hear.” This technology, Coulson said, “is life-changing in a way that a hearing person might not understand.”

Michael Doise, who works as an accessibility specialist and app developer in Austin, has optic-nerve hypoplasia — his optic nerves didn’t fully develop when he was born. When he’s with his family, he rarely sees their facial expressions, since it would be awkward to hold a portable magnifier up to their faces while they hang out. Even on his computer, he has trouble magnifying their images efficiently. But on a group video call, wearing his AVP, “I could actually see their facial expressions,” he said. “It’s a remarkable feat of engineering for someone who’s blind. Are they happy? Smiling? Knowing what all that looks like is huge for me.”

Neurodiverse users have also found value in the AVP. “I generally feel a lot better after having worn it for a while,” a user with autism and ADHD told me. “It’s like a reset for the brain.” When I chatted with them, they’d just drained their AVP’s battery by spacing out in the immersive lunar environment. “My brain just is hyperfocused on whatever stimulus comes in, so whatever I can do to manually cut those stimuli off helps me tremendously,” they said. “The Vision Pro is noise-canceling headphones for my eyes.”

Ryan Hudson-Peralta, who was born with no hands and short legs he’s unable to walk on, remembers his first computer, in middle school. He would go from class to class with a bulky ’90s-era Windows laptop on his wheelchair, typing notes and using rudimentary dictation software to complete his assignments. But he had to contort himself just to log in: “I was literally putting my lip on the control button, using my nose or arm to tap the other button,” he said. Then someone showed him an Apple computer, which had a function called Sticky Keys that lets modifier keys stay active after a single press, freeing him from gymnastic approaches to chorded commands.

Today, Hudson-Peralta drives his adaptive SUV to his job in downtown Detroit, putting in long hours as a principal designer at Rocket Mortgage, where he designs the company’s apps and websites using a traditional mouse and keyboard and his Mac’s accessibility features. One morning this spring, though, his back flared with pain. “I was having trouble getting around that day,” he said, so he took the day off and did work for his consulting agency, Equal Accessibility, from bed, wearing the Vision Pro, surrounded by screens he controlled with his eyes and a series of custom mouth-sounds that triggered selections. “As I get older, and this happens more often for me,” he said, “I envision myself working virtually with the AVP even more.”

For now, though, he mainly uses the AVP for entertainment, watching immersive videos on the headset, where he takes the perspective of a player running across a soccer field, or standing in a recording booth next to Alicia Keys. “Having a disability, I never ran in my life,” Hudson-Peralta told me. “I was sitting on the floor when I watched the Alicia Keys video, and I really felt like I was standing.” He knew it was an illusion, but the immersive tech gave the illusion a visceral veracity.

I asked him what he thought of the disabled critique of Avatar, which seemed to suggest that its paralyzed protagonist’s life was only worth living when he was liberated from his wheelchair. He had no patience for this argument. “The other characters who could walk, they were jumping into avatar suits, too,” he said. The impulse toward escapism is universal.

The same week I visited Maxine Collard at her lab at UCSF, I went to Cupertino to meet some of the disabled software engineers who’d helped build the accessibility features on the Apple Vision Pro. I was ushered inside Apple Park by a Deaf member of the company’s accessibility PR team who spoke to me through an ASL interpreter. Seeing my white cane, he explained that he was signing and that the voice I heard wasn’t his. (Though I’m legally blind, I have enough residual vision that I’d clocked the interpreter myself; still, I appreciated the gesture.)

Strolling along the curved-glass perimeter of Apple’s massive ring-shaped corporate headquarters felt like walking on a treadmill. We ambled along for several minutes, the curved-glass wall unchanging on our left, the trees rolling languidly past on our right. I was there the week of Global Accessibility Awareness Day, which Apple was observing, in part, by holding a series of internal-facing events to raise awareness among its 150,000 employees about the company’s work in the field.

It wasn’t inevitable that Apple would evolve to be so inclusive. Gregg Vanderheiden, an early accessibility consultant at the company, recalled a conversation he had with an engineer in the mid-’80s. The engineer confessed to Vanderheiden that he was worried he’d be fired for working on Sticky Keys. Vanderheiden asked why. “Because Sticky Keys is priority seven,” he replied, “and we’re under strict orders to focus on priorities one through three.”

Today, the company seems truly committed to accessibility, and not just for the goodwill it might generate. After all, one in four Americans has a disability — a market any corporation would be foolish to ignore. It required no great leap for Apple to serve this population: All technology, in the most basic sense, is prosthetic.

We reached the conference center, where someone handed me an Apple Accessibility Passport, a single folded page with six colorful icons printed in a raised, textured material so they were tactilely discernible (and labeled in braille), representing the five main categories of Apple’s accessibility offerings: vision, mobility, speech, hearing, and “cognitive” — plus, perhaps a bit hopefully, a category devoted exclusively to Apple Vision Pro, which was represented on the passport by a raised-line silhouette of the goggles.

The room was set up like a science fair with stations representing each category. After visiting each of the six stations, employees received a tactile bump-dot sticker to place on their passports. One of my minders, another member of the accessibility PR team, cheerfully insisted on applying my tactile bump-sticker every time I visited a booth.

I smiled and nodded through our first stop at the Vision station as an Apple employee dutifully showed me how to use the Magnifier app, which I’d used that morning to read what was, for me, the illegibly distant menu at an airport café. Another worker showed me an iPad set to Assistive Access mode for users with cognitive disabilities. The normally dense screen of apps was reduced to five huge buttons, and each of these apps had its complex functions hidden or renamed. I saw a demonstration of how, using Apple Home and a smart bulb, d/Deaf or hard-of-hearing users can have a lamp change color when the doorbell rings, or be alerted visually or haptically to the sound of a baby crying or a fire alarm.

Making my way around the stations, it struck me how many of these features were also available as specialty assistive devices. But those devices tend to be overpriced and undersupported, and their producers go out of business or abandon products with alarming frequency. It would cost me about as much as an iPhone to buy a portable video magnifier with all the features of Apple’s built-in Magnifier app, and whatever advantages I’d find in a device built explicitly for a person with low vision I’d lose in my need to carry an extra gadget around. A specialty device would also inevitably have inferior construction, not to mention the weirdness of pulling out an unfamiliar device in public, as opposed to the same little brick everyone else at the airport is waving around.

The AVP isn’t as familiar as the iPhone, but, as I experienced with Collard, in public people are more likely to ask for a review than an explanation. Under the hood, however, it’s as specialized as any bespoke piece of disability tech. At the AVP booth, I met Dan Golden, a software engineer with low vision who works on accessibility across Apple’s platforms. Golden told me that during the development process, he had trouble using eye-tracking. One of his fellow engineers was working on pointer control — a way of turning off eye-tracking on the AVP and making selections by pointing with one’s head. They shared the incomplete feature with Golden, who immediately began using it in his own testing of the device, in turn giving his colleague feedback to refine it.

As Golden spoke, I was struck by his resemblance to Collard. His story echoed hers in many ways, down to the epiphanic childhood discovery of the iMac’s full-screen zoom shortcut, which a teacher showed him when he couldn’t follow the flying letters on a learn-to-type program. After that, he begged his parents for a Mac, and has been “an Apple person” ever since.

There is an adage in disability-rights circles, “Nothing about us without us,” which holds that equity and inclusion for disabled people (another way of saying real accessibility) will be realized only if people with disabilities are part of the design process. Here, I felt with sudden force, was a stark instance of that ethos in action: a low-vision software engineer with strabismus who had helped design and test the Apple Vision Pro, and, on the other end of the equation, Maxine Collard, with a similar suite of disabilities, who could barely contain her enthusiasm for a tool that allowed her to do her work with a freedom and ease she’d never imagined for herself.

This epiphany had, of course, been orchestrated for me by Apple, with its accessibility PR team guiding me through the experience room, stamping my passport, then introducing me to Golden, the visually impaired engineer. But both things could be true: This was well-orchestrated PR, but it was also a real part of Apple’s corporate culture. Nearly every blind person I know has an iPhone because it provides unparalleled ease of access to information compared with any other option on the market. Online, disabled users still grumble constantly — updates frequently break beloved features and often take too long to fix. Many blind people have wondered with annoyance why, for instance, a $3 trillion company can’t get the screen reader on its desktop and laptop computers working half as well as the one on its phones. But these are largely the gripes of devoted consumers.

The AVP is still a first-generation product, and there are bugs. Usman Haque, a Phoenix-based data-science manager at a large insurance company, also goes by @TwoFZeroT on social media — a.k.a. Two Fingers, Zero Toes. He’s a below-elbow, below-knee bilateral congenital amputee. Haque bought an AVP eager to test-drive its accessibility features. As soon as he turned it on, he said, “I was floored. I just spent $3,500, and I’m not giving it back. This is amazing.” That first impression, however, soon soured.

The AVP’s calibration process requires users to hold up their hands so the headset can recognize and begin tracking them for the pinch gestures used to select items. But the AVP didn’t recognize Haque’s atypical hands, so he had his daughter stand behind him and pinch at the appropriate moments, letting him turn on Dwell Control. This worked well for a while, but often when Haque woke the device after a break, Dwell Control would be deactivated, and he’d have to go through a frustrating series of troubleshooting steps to turn it back on, sometimes needing assistance from his family.

“I apologize for using this term,” he said, “but it felt crippling.” The device had offered Haque a thrilling sense of freedom, where he could control a vast, dazzling digital ecosystem with just his eyes — but then that freedom was suddenly yanked away. This was enough to douse his feelings of amazement. “It’s not ready for me yet,” he said. “I gave it back.”

For Ryan Hudson-Peralta, the designer at Rocket Mortgage, the problems started with unboxing the AVP: He found that he couldn’t independently press the Digital Crown, which is required to set up the device. To generate his “Persona” — the uncanny digital avatar that the AVP shows to FaceTime callers while a user is wearing the headset — you hold the device out in front of you, at arm’s length, so it can capture your likeness. But because Hudson-Peralta needed his son to hold it up for him, the angle was off, and as a result his Persona looks even stranger than most users’: It erased his neck.

Others are waiting for upgrades. Collard submitted a feature request to open up the AVP’s zoom feature to the pass-through cameras — at the moment, users can magnify only digital objects generated by the AVP, not their real surroundings: the departures monitor at an airport, for instance, or the text on a prescription-medication label. So far, Apple hasn’t changed the feature.

At Apple Park, I walked with Dan Golden, the low-vision software engineer, into a conference room where Jordyn Castor was waiting for us on a FaceTime call from Colorado. Like Golden, Castor works on vision accessibility, doing quality-assurance testing. Unlike Golden, she’s completely blind and uses VoiceOver, the screen reader built into all of Apple’s operating systems, including the one on her Vision Pro headset.

Castor told me that accessibility is treated at Apple as a core value, a human right. She described the exhilaration she felt using VoiceOver on the Vision Pro to demo a game that let her play a virtual handpan drum. There was the same wonder in her voice that sighted users express when the butterfly lands on their finger in the device’s flagship demo, Encounter Dinosaurs. “I was playing the drums with my hands like I’m playing the drums on the table in front of me!” Castor said. “It was unlike anything I’ve experienced in the accessibility realm.”

It’s easy to look at accessibility as a binary — a device either has screen-reader functionality or it doesn’t; captions are either available or they’re not. But digital accessibility exists on a spectrum, and what works for some users won’t work for others. Castor is the only blind person I’ve been able to find who uses the Vision Pro’s screen reader reliably. Most low-vision users — myself included — who use VoiceOver on our phones or computers find it too chaotic and unwieldy on the Vision Pro. Watching Collard use her AVP, I was struck by her expertise: She drew on decades of IT troubleshooting and a bone-deep familiarity with the Apple ecosystem to surmount the numerous small obstacles that arose as she demoed her workflow for me.

The first time I wore an AVP, I was astonished by how intuitive it was to use — within a minute or two, I was opening, closing, and resizing windows, dialing down my surroundings and turning up a Joshua Tree landscape. A college student I met on InSpaze, a spatial chatroom where AVP users hang out, told me that the first time he let his older brother, who has Down syndrome, use his AVP, his brother independently played video games on it for two hours. But this native intuitiveness can fall away the further a disabled person strays from the typical, mainstream user. I don’t doubt that Castor can fluidly use her AVP entirely through audible feedback, but she’s also a lifelong screen-reader user with a B.S. in computer science, not to mention a full-time engineer at Apple. Users with less expertise can struggle to figure the device out. It’s also worth noting that, within a disabled population that is chronically underemployed and impoverished, these users represent a rarefied subset who can drop a few thousand dollars — often with professional interest — on this class of first-gen tech toy.

Still, my brief experience with the AVP allowed me to imagine a future version in which, for instance, the price comes down, Apple opens up the front-facing cameras to developers, and what is already a powerful low-vision device becomes the ultimate tool for blind and low-vision people. When I play the complicated tabletop games my son adores and press a game’s card to my nose to read it, I often find myself wishing I could tap the blocks of indecipherable text the way I can tap a paragraph of text on my iPhone and hear it read aloud. It’s easy to imagine a not-so-distant future in which I could wear a fourth-gen AVP, leveraging whatever comes after GPT-4o, tap one of the game cards with my finger, and hear a readout of the text printed there, along with a description of whatever illustration is on the card. If I preferred to use my residual vision, I might casually use two fingers to zoom in on the card (or my son’s face) the way you’d enlarge a photo on your iPhone.

Accessibility operates, to use the language of computing, in a stack. In the example above, the game is accessible only if the AI app describing my video feed can be used by a blind person. And as a blind user, I can access that app only if the operating system it runs on is built with a disabled user in mind. If any link in the chain fails, the whole system crashes.

In May, when Sonos, the smart-speaker company, updated its app, its screen-reader accessibility regressed significantly, making the app — which one needs in order to control the company’s speakers — unusable for blind users and threatening to turn thousands of dollars’ worth of high-end audio gear into expensive, silent sculptures. Everyone relies on fallible technologies, but disabled users are in an especially precarious position, and this experience of a smoothly functioning stack suddenly imploding with a developer’s capricious update happens constantly, even with companies that have demonstrated a commitment to accessibility in the past. (Sonos has since pledged to make accessibility improvements by mid-June.) In April, an analyst with sources inside Apple’s supply chain reported that Apple had slashed AVP shipment projections for 2024 after “demand in the U.S. market [had] fallen sharply beyond expectations.” If the device doesn’t find a market and Apple discontinues it, many disabled users will see their accessibility stacks implode.

Maxine Collard’s bedroom in San Francisco is dominated by the double bass she played in orchestras growing up. “It’s an albino,” she said with a grin, pointing out the unusual blond-wood design, “just like me.” In high school, she tried a tragicomically long list of adaptations to read her music while maintaining the proper posture the double bass requires — she couldn’t lean her face into the music stand the way she could with her computer screen. In the end, she just memorized an astonishing amount of Brahms. Now, wearing her AVP, she can stretch her sheet music to the size of the wall, and years of orchestral struggle are erased with a flick of her wrist.

I asked about a glowing piece of tech sitting on a table next to the bass. Perched beside a record player, it looked like a device from a 1950s electronics lab, but with contemporary manufacturing. “Oh, that’s a big tube amp,” she said. “With turntables, the whole argument that somehow they sound better is, like, quantitatively not true. Vinyl’s not going to do a better job of reproducing audio than digital is. There’s good math that proves it.” She gazed at the amp through the black-mirrored surface of her AVP. “What is true,” she said, “is that the particular way in which it is worse — it’s better. The distortions this amp gives to the sound, they’re actually just way more pleasant-sounding, if you’re a human.”

This idea illuminates an important aspect of accessibility: There is a quantitative way of looking at disability, where without hands, or working eyes, you’re mathematically lower on the human number line (priority seven, rather than one through three, in the logic of software development). It seems like common sense that using a keyboard with Sticky Keys, or reading email with a screen reader, or browsing the web with a joystick, should be a second-class experience. But when the tools work, and the signal comes through clearly, whatever minor distortions remain can feel as warm and invisible as surface noise on a vinyl record.

At the end of my visit to Apple Park, I sat in the ring building’s central garden with Sarah Herrlinger, who’s advocated for people with disabilities within the company for more than 20 years. When she told me that Apple aims to re-create “surprise and delight” for all its users, including those with disabilities, I brushed the comment aside as more Jobsian marketing copy. But the next day, in Collard’s apartment, watching her demonstrate how she used to play video games (knees pressed against her TV console, nose almost to the 60-inch screen) and how she does now (lying on her bed playing Metroid Prime on a screen the size of her ceiling), I saw her point. The Vision Pro wasn’t giving Collard superpowers, or correcting her vision, or erasing her disability. It just gave her access to the same experience — of efficiency, competence, and pleasure — that most mainstream users accept as a given.

Correction: This story has been updated to more accurately describe Dan Golden. 
