At WWDC today, Apple announced the headlining features of visionOS 26, its next big OS release for Vision Pro. Among them is a revamped spatial photos feature that ought to make those images even more immersive.

Vision Pro launched with the ability to view spatial photos, captured either with the headset itself or with an iPhone 16, iPhone 15 Pro, or 15 Pro Max. These spatial photos create a sense of depth and dimensionality by combining stereo capture with depth mapping applied to the image.

Now, Apple says it’s applied a new generative AI algorithm to create “spatial scenes with multiple perspectives, letting users feel like they can lean in and look around,” essentially ‘guessing’ at details not actually captured on camera.

With visionOS 26, Vision Pro users will be able to view spatial scenes in the Photos app, Spatial Gallery app, and Safari. The company says developers will also be able to use the Spatial Scene API to add the feature to their own apps.
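For developers wondering what adoption might look like, below is a minimal Swift sketch of generating and displaying a spatial scene with RealityKit. Apple hasn't detailed the API here, so the type and method names used (ImagePresentationComponent, Spatial3DImage, generate(), desiredViewingMode) are assumptions drawn from the company's WWDC developer materials rather than confirmed interfaces; check the official documentation before relying on them.

```swift
import SwiftUI
import RealityKit

// A minimal sketch of how a visionOS app might request and display a
// generated spatial scene. The RealityKit names below are assumptions
// based on Apple's WWDC developer materials; verify against the
// current documentation.
struct SpatialSceneView: View {
    let photoURL: URL  // a photo already on device

    var body: some View {
        RealityView { content in
            do {
                // Load the source photo and ask the system to synthesize a
                // multi-perspective "spatial scene" version of it.
                let spatialImage = try await ImagePresentationComponent
                    .Spatial3DImage(contentsOf: photoURL)
                try await spatialImage.generate()

                // Attach the generated scene to an entity so it renders.
                var presentation = ImagePresentationComponent(spatial3DImage: spatialImage)
                presentation.desiredViewingMode = .spatial3D

                let entity = Entity()
                entity.components.set(presentation)
                content.add(entity)
            } catch {
                print("Spatial scene generation failed: \(error)")
            }
        }
    }
}
```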


To show off the new AI-assisted spatial photos feature, real-estate marketplace Zillow says it's adopting the Spatial Scene API in the Zillow Immersive app for Vision Pro, which lets users see spatial images of homes and apartments.

Apple’s visionOS 26 is slated to arrive sometime later this year, although the company says testing is already underway.


