Security Bite: What exactly does Vision Pro tell you about your environment?

Last week, Apple finally published a detailed Apple Vision Pro and visionOS privacy overview. While it could have been available at launch, the document helps explain exactly what the spatial computer collects from our environment, what it sends to third-party applications, and much more…


Privacy, Shmivacy: Why should I care?

For many of the security researchers I speak with, any mention of mixed reality raises serious concern. While consumers are more preoccupied with the Vision Pro’s nearly $4,000 price tag, those in the security space seem far more attuned to the dangers. After all, this is a device with six microphones and twelve cameras that you wear around your home.

As I highlighted in a previous Security Bite post, the general privacy risks of Apple Vision Pro and other headsets can be alarming. For example, the distance to the ground measured by depth sensors can reveal a user’s height. The sound of a passing train could help pinpoint a physical location. A user’s head movements can be used to infer emotional and neurological states. The data collected about the user’s eyes is arguably the most concerning. Not only could this enable targeted advertising and behavioral profiling, but it could also reveal sensitive health information; it is not uncommon for ophthalmologists to help diagnose illnesses simply by looking at a patient’s eyes.

New details about data protection in the Vision Pro environment

Although seemingly real, environments in Apple Vision Pro are created using a combination of camera and LiDAR data to provide a near real-time view of the user’s space. In addition, visionOS uses audio ray tracing to simulate how sound waves interact with objects and surfaces. Applications overlay these scenes or, in some cases, create environments of their own.

With the release of the new Vision Pro privacy overview, we can now better understand what environmental data the headset gathers and shares with applications.

  1. Plane estimation: Detects nearby flat surfaces on which virtual 3D objects, or what Apple calls Volumes, can be placed. This improves immersion by letting users interact with virtual objects as part of their physical environment (see the code sketch after this list).
  2. Scene reconstruction: Creates a polygonal mesh that closely matches the contours of objects in the user’s physical space. This mesh helps virtual objects align correctly with physical ones in the user’s environment.
  3. Image anchoring: Ensures that virtual objects remain anchored in their intended position relative to real objects, even as the user moves. The WSJ’s Joanna Stern demonstrated this technology early on in a video posted on X, in which she placed multiple timers over items cooking on a stove.
  4. Object detection: Apple says it uses object detection to identify “interesting objects in your room.” In the broadest sense, Vision Pro uses it to detect what is around you.
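
For developers, these capabilities surface through ARKit’s data providers in visionOS. As a rough illustration, here is a minimal sketch, assuming a visionOS app that is already running in an immersive Full Space and has been granted access to surroundings data, of how an app might subscribe to plane and scene-mesh updates:

```swift
import ARKit

// Minimal sketch (assumptions: a visionOS app in an immersive Full Space,
// with world-sensing access already granted by the user).
func observeSurroundings() async throws {
    let session = ARKitSession()
    let planes = PlaneDetectionProvider(alignments: [.horizontal, .vertical])
    let sceneMesh = SceneReconstructionProvider()

    // Run both data providers in a single ARKit session.
    try await session.run([planes, sceneMesh])

    // Plane estimation: each detected flat surface arrives as a
    // PlaneAnchor, classified as a table, floor, wall, and so on.
    for await update in planes.anchorUpdates {
        switch update.event {
        case .added, .updated:
            print("Plane \(update.anchor.id): \(update.anchor.classification)")
        case .removed:
            print("Plane removed: \(update.anchor.id)")
        }
    }
}
```

The same pattern applies to scene reconstruction: `sceneMesh.anchorUpdates` delivers mesh anchors an app can use to align virtual content with real objects.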

By default, apps on Vision Pro cannot access environmental data. To make experiences more realistic, third-party developers can request access to it, much like an app asking for access to your photos or camera on an iPhone. An app running in a Full Space on Vision Pro can access environmental data to enable even more immersive experiences.
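
As a rough sketch of the developer side of that consent flow, an app can ask ARKit for world-sensing authorization before running any providers (simplified; everything beyond the ARKit calls themselves is illustrative):

```swift
import ARKit

// Sketch: requesting world-sensing access on visionOS. This triggers
// the consent prompt shown before an app can read surroundings data.
func requestSurroundingsAccess(using session: ARKitSession) async -> Bool {
    let results = await session.requestAuthorization(for: [.worldSensing])
    return results[.worldSensing] == .allowed
}
```

If the user declines, the status comes back as denied and the data providers never receive updates, so apps are expected to degrade gracefully to an experience that works without surroundings data.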

“Encounter Dinosaurs, for example, requires access to your environment so that the dinosaurs can charge through your physical space. By giving an app access to environmental data, the app can use a scene mesh to map the world around you, detect objects around you, and determine the location of specific objects around you,” Apple explains.

However, each app only has access to information about your surroundings within a five-meter radius of your location. Beyond that distance, immersion elements such as shadows and reflections no longer appear.

