
Apple Vision Pro AI 'mind-reading' patent summary leads to dystopian FUD for future visionOS users

Will the Vision Pro track your brain? Maybe. (Source: Apple)
The inaugural Vision Pro's launch has prompted a Twitter thread from a researcher who contributed to "neurotechnology" development at Apple. It outlines details on the headset (or its possible successors) that might be read as meaning the hardware (or its AI, rather) can read the user's mind. That may or may not be the case; nevertheless, the thread has helped fuel reservations about visionOS-powered devices and their potential effects on users and humanity as a whole.

Apple's triumphant preview of the Vision Pro during WWDC23 has garnered Black Mirror references from all quarters ever since, largely thanks to claims of a precision-crafted experience that comes as close to a 'brain-chip' experience as humanly possible, especially for a first-gen XR headset.

Now, Sterling Crispin, a Twitter user who claims to have "spent 10% of my life contributing to the development of the #VisionPro while I worked at Apple as a Neurotechnology Prototyping Researcher in the Technology Development Group", does indeed purport to confirm that the device is (or, perhaps more accurately, will one day become) "a crude brain computer interface".

Crispin elaborates on this definition in an effusive (and mildly redacted due to NDAs) Twitter thread on the patents to which he contributed in the course of Vision Pro development at Apple. They apparently include IP on AI designed to predict "emotional and cognitive responses" from the user via various headset-mounted sensors.

This may result in the ability to "infer" this data by "quickly flashing visuals or sounds to a user in ways they may not perceive, and then measuring their reaction to it". These techniques may have applications such as sophisticated click-prediction through analyzing the user's pupils.

Apple's neurotech researchers apparently observed that this part of the human eye tends to move in specific ways prior to the equivalent of a mouse-click on the headset. Furthermore, the AI is also programmed to "create biofeedback with a user's brain by monitoring their eye behavior, and [redesign] the UI in real time to create more of this anticipatory pupil response".
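The anticipatory-pupil idea described above could be illustrated with a minimal, purely hypothetical sketch (not Apple's implementation; the function names, sampling rate and dilation threshold are all invented for illustration): flag an imminent selection when pupil diameter dilates faster than some threshold over a short window of samples.

```python
# Hypothetical sketch only: a toy "anticipatory pupil response" detector.
# All names and numbers here are assumptions for illustration, not Apple's IP.

def pupil_dilation_rate(samples, dt):
    """Average rate of change (mm/s) across a window of pupil-diameter samples,
    taken dt seconds apart."""
    if len(samples) < 2:
        return 0.0
    return (samples[-1] - samples[0]) / (dt * (len(samples) - 1))

def predicts_click(samples, dt=0.01, threshold_mm_per_s=0.5):
    """Return True if the pupil dilates quickly enough to suggest an
    imminent selection, per the anticipatory-response idea described above."""
    return pupil_dilation_rate(samples, dt) > threshold_mm_per_s

# A steady pupil vs. a rapid dilation burst (diameters in mm):
steady = [3.00, 3.00, 3.01, 3.00, 3.01]
dilating = [3.00, 3.05, 3.11, 3.18, 3.26]
print(predicts_click(steady))    # → False
print(predicts_click(dilating))  # → True
```

A real system would of course involve trained models over many sensor streams rather than a single threshold; this sketch only conveys the shape of the signal being exploited.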

That may read as excessively intrusive at first glance - then again, the "biofeedback" in question is now commonly used in non-invasive healthcare contexts to help treat a range of physical and mental conditions. Crispin also indicated that the resulting machine learning might mostly function to analyze progress in Vision Pro use-cases such as XR learning resources or the new Mindfulness feature in any case.

It also apparently depends on a range of sensors designed to measure or estimate "electrical activity in the brain; heart beats and rhythms; muscle activity; blood density in the brain; blood pressure, and skin conductance" - none of which has been mentioned in relation to the Vision Pro alongside its advanced eye-tracking capabilities thus far.

In addition, Apple categorically stated that it would not be privy to data on user gaze in apps such as the Safari browser during its WWDC23 keynote. Nevertheless, Crispin's thread is now one more source of inspiration for thinkpieces on the potential downside of Apple's ambition to make computing spatial in the future.


Deirdre O'Donnell, 2023-06-10 (Update: 2023-06-10)