Vision Pro is an impressive headset but it’s not perfect right out of the gate. Luckily, Apple has a great track record of improving products for years through software updates, and Vision Pro isn’t likely to be an exception. But what should the company focus on first? Here’s what I’d like to see.

Apple first revealed Vision Pro to the world on June 5th, 2023, at its annual WWDC event.

This year, WWDC 2024 is being held on June 10th, just after the one-year anniversary of Vision Pro’s announcement. That’s where we expect the company will reveal details of the first major software update to the headset’s visionOS operating system.

It won’t be long before we see what Apple has in store, but after using the headset for months, here’s our wishlist of features we hope to see in visionOS 2.0.

Update (June 4th, 2024): With WWDC just around the corner, we’ve updated and added to our wishlist of visionOS 2.0 features.

Window Management

Even if Vision Pro did nothing else, it’s pretty great at making virtual windows appear in the world around you. You can put them anywhere you want, change their size, and use them for apps, games, movies, music, and more. You can place screens all over your house, or wrap yourself in a bubble of screens for the ultimate productivity space.

But while you can individually move and resize windows at will, the windows are largely unaware of each other. They will fade out nicely on the edges so they don’t clip through one another. But it’s very easy to spawn a window in front of another window, making the one in the back inaccessible without moving the first one out of the way.

There’s got to be a better way to manage many windows, and this is the #1 thing I’d like to see in the first big Vision Pro update. There are a few different features that should probably be considered.


For one, it would be nice to be able to ‘pin’ a window so it never moves, even if you recenter the device (which recalls all windows in front of you). This would allow users to make permanent arrangements of windows around their home and other frequented locations.


You can pick up windows and literally carry them with you from one room to another, but sometimes you just want the window to be a good boy and follow you on its own. A ‘follow’ mode would allow windows to hover near you. You might think they should just attach directly to your face like a HUD, but in practice that’s pretty uncomfortable. Having a ‘soft’ follow system that just slides the window closer once you get a certain distance away would be ideal.

Presets & Groups

I find myself commonly setting up similar window arrangements in Vision Pro. The most commonly used is one big window in front of me with two smaller windows flanking it on the left and right. I’d love to be able to save this arrangement as a preset that I could open with one click, instead of opening and placing three different windows.

And window grouping would synergize nicely with this, not only making it easy to define which windows you want to save into a preset, but allowing you to move all of them as a unit, without breaking their existing arrangement.

App Flow

In Vision Pro, every app essentially becomes its own window. So if you typically jump between an email client, calendar, music player, task list, team chat, and browser… well, you quickly end up with a lot of windows surrounding you. Not only do you quickly run out of space, but having too many windows means constantly moving your head around to look at them, which isn’t ideal for productivity.

‘App Flow’ could let you put multiple apps into a single window and swipe between them with ease. And yes, I’m calling this ‘App Flow’ as a nod to Apple’s Cover Flow system, which could be a great conceptual starting point for this kind of feature by treating each app window like an album cover.

Mission Control

And of course there’s Mission Control, the macOS view that fans out all windows right in front of you so you can quickly pick one to focus on. A simple gesture to invoke something like Mission Control on Vision Pro could put a miniaturized view of all windows in front of you, making it easy to pick the one you want to see at that moment.

Multiple Users

Image courtesy Apple

I get it, Apple. Vision Pro is custom-fit to every customer, so you don’t want people wearing the headset long-term if it isn’t fitted for them. And that’s probably a big reason why you can only have a single user profile on the device. Because there’s only one user profile, Vision Pro is inherently not a very shareable device; after all, all of your stuff is right in there… emails, texts, browsing history, etc. It would be like handing someone your phone for the day. Guest Mode is useful for quick demos, but it’s not a replacement for real multi-user support.

But would it be that hard to let someone else who wants to use the same headset do a face scan and order themselves a fitting facepad? I mean, the thing is already easy to swap thanks to the magnets. It would be trivial to pull one out and pop the other in when changing users.

So that’s the fit issue mostly solved, which leaves the software side. Optic ID is already the ideal solution for multiple users. Because the headset can uniquely and securely identify people by their irises, it could easily differentiate between two registered users without either needing to tell the headset which profile to use. Not only would this point each person to their own apps and content, but the device could quickly load each user’s stored eye-calibration and IPD values for seamless switching.

This feels like a no-brainer, but considering that iPhones and iPads don’t have readily available multiple user profiles, we’re not sure we’ll ever see it.


Charging Indicator on the External Display

The Vision Pro battery has a single LED that I’ve only ever seen turn green or orange. Orange presumably means ‘low battery’ but how low? And what exactly does green mean? Fully charged? Mostly full?

All of these questions could be easily answered by putting a charging indicator on the headset’s front display when it’s plugged in but not being worn. A tiny battery icon would be fine, but Apple could even do something a little more interesting by slowly filling the whole display from left to right with that cool cloudy aesthetic used when you’re fully immersed.

We actually know that Apple has already considered this kind of use for the display, going all the way back to when the company first drew up the patent.

Multiple Mac Displays (or better yet, virtualized app windows)

Image courtesy Apple

Vision Pro seamlessly connects to a local Mac computer on the same network and presents a sharp virtual display. With it you can use the full power and features of your computer from inside the headset, and you can even multitask with Vision Pro apps floating alongside it.

But unfortunately this is limited to a single virtual display. Many professionals use more than one monitor so they can spread out their work with less window management, so it’s clear why someone would want more than one virtual display in the headset too.

Better yet, a button to create not virtual desktops… but virtual applications… could be great. Instead of a single box showing your Mac desktop, what if each application could spawn its own window inside your headset, just like other VisionOS apps? This would definitely introduce a lot more technical complexity than just having two virtual desktops, but it would surely be the most seamless way to implement this feature.

Desk Mode

Hand occlusion on Vision Pro makes your real hands show up in front of windows and inside fully immersive content, making the virtual content in front of you seem far more real. Without it, your hands would always appear ‘behind’ windows, no matter how far the window is from you, which breaks the illusion.

But it’s only your hands and arms that can show through. If you’re holding a cup of coffee or using a keyboard and then flip to a fully immersive environment, you’ll see your arms and hands… but the coffee cup becomes invisible. It’s actually a pretty weird experience; almost like the headset is erasing a part of reality that’s right in front of you.

‘Desk Mode’ could continue to reveal anything that’s on a plane in front of you (your desk or table). So your keyboard, coffee mug, notebook, phone, and more could continue to be visible even if you want to be fully immersed.



Quick Gestures

As the first version of this headset, it’s not surprising to find some parts of the experience that just need to be a little easier. Opening notifications or Control Center, for instance, requires looking above you and then clicking a tiny dot to expand it. From there you need to click another icon to reach the actual Control Center.

For how often Control Center gets used, it would really benefit from a system-wide hand gesture to bring it into focus. Gestures are pre-determined hand motions the headset can recognize. Quest, for instance, has a gesture where turning your palm upward and then pinching your fingers will open a quick-menu, which is very handy.

Something similar on Vision Pro would not only make it much faster to check the time and get to Control Center, but it would also make getting to your notifications way faster.

There are a few other ripe opportunities to speed up Vision Pro with gestures. Opening the app menu, for instance, probably shouldn’t require reaching all the way up to the headset to press a button you can’t see. And gestures should surely play a role in window management.

A Better Headstrap and Thinner Light Shield

Ok, this one obviously can’t come as a software update, but with the intentional modularity of the headset’s strap, it’s something that Apple could easily add.

I’ve said it before and I’ll say it again: Vision Pro isn’t uncomfortable because of its weight… it’s uncomfortable because of its strap. Apple clearly didn’t want a bulky strap like we see on many other headsets, but this form-over-function approach has hampered the headset’s ergonomics.

We’re glad they at least included the dual-loop strap with every headset as a more comfortable option, but there are still big wins to be had from a rigid strap that also includes some counterbalance. And if you’re adding weight anyway, a great way to do it is with some extra battery back there! Oh, and suddenly that would mean the tethered battery could be hot-swappable… the benefits add up.

And further, Vision Pro could use a thinner Light Shield. That’s what Apple calls the face gasket that holds the headset against your face around your eyes. This is a bit different for everyone, but when I use the headset with the Light Shield completely removed, it noticeably increases the headset’s field of view, and leaving your peripheral vision open also lets in more of the real world around you.

A rigid strap with a battery counterbalance and an eye-relief adjustment stands to drastically improve Vision Pro’s ergonomics.

There’s no telling if or when any of these features might reach the headset, but in the meantime Vision Pro has a ton of useful tricks and settings that you’ll definitely want to know about.

What improvements do you want to see that don’t need to wait until Vision Pro 2?


This article may contain affiliate links. If you click an affiliate link and buy a product we may receive a small commission which helps support the publication. More information.

Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • ViRGiN

    Maybe one day it will even be able to support bluetooth mouse.

    • Daniel

      They are actively working on mouse support!

      • ViRGiN

        Yeah, they are also actively working on 1000 other things.
        Some things are pretty basic and obvious, especially for “spatial computer”. This is just Apple things.

    • Nathan

      mouse support is not trivial on the AVP. The mouse click looks very similar to the pinch gesture so the AVP could be confused

      • ViRGiN

        that’s not even a close gesture; and they surely could detect when the mouse is actually in use

        • Nathan

          it is close. The thumb and index finger when holding the mouse look very similar to the pinch gesture. I guess they tested it and it registered false “eye-pinch” clicks, so they decided to exclude it. Programming-wise, excluding mouse support while supporting trackpads is some additional effort.

          • Arno van Wingerde

            IF Pinch gesture AND registered Mouse button THEN drop pinch gesture.

        • Nathan

          and you can click the mouse while it is perfectly still, so no, you can’t tell if the mouse is in use before the click happens

          • ViRGiN

            the thumb is usually resting on the side of the mouse, bending index finger is easy to notice.

            they could do a million things. like, disable hand input while mouse is in use, wait 2-3 seconds before giving back hand controls when no mouse usage is detected, also paired with where the eyes are looking.

            if they aren’t able to support standard mouse, they aren’t doing “a spatial computing device” but “a spatial content consuming device”.

            All work related usage i’ve seen across tons of videos is a pure joke. People looking into these are there for affirmation that AVP is such a great workhorse.

  • Christian Schildwaechter

    It’s encouraging that (better) window management is #1, as this problem popping up means you already use AVP enough to make sorting them manually a hassle. Which should mean it turned out to be actually useful. That’s a different kind of problem than years of complaints about basic Quest UI usability like “why can’t I order my apps”, “why do I see the feed first”, or the still ongoing “where did they hide my wishlist this time”.

    Many of the problems seem fixable, but with Apple there is no guarantee they will be fixed. I wouldn’t get my hopes up for multi-user support, with AVP positioned like an XR iPad Pro, and both iPhone and iPad being single-user devices. The iPad actually has multi-account support, but it’s not available to end users, only for business and educational use via MDM (Mobile Device Management). Sharing the expensive AVP would be nice, but Apple will probably just wait several generations for the price to drop enough to make buying one per person feasible. At least the virtualized app windows are already available in (pre-alpha) Ensemble, allowing you to cast Mac windows to visionOS. github_com/saagarjha/Ensemble

    I hope AVP lights a fire under Meta’s butt, because we really could use more competition in XR. And after some Unity developers recreated the Apple UI on Quest Pro within days after its presentation, and even on Quest 2 with head tracking, someone on the MRL UI/UX team must have gotten fired for not implementing it years earlier, as there is no way MRL hadn’t tried this at one point.

  • STL

    We? I couldn’t care less. The only update I would love to see is PC VR (Steam) support!

    • Runesr2

      Exactly, the weak mobile GPU is totally burdened having to render 23 million pixels per frame, making it so underpowered that Apple focuses on non-gaming 2D apps.

      Who cares about Vision Pro? Apple don’t care about VR. Why should RoadToVR care about roads away from VR?

      Ben should spend some time focusing on roads to VR instead, like reviewing PSVR2 RE4 Remake VR and Bootstrap Island – Ben has truly lost his way.

    • Zantetsu

      Yeah, “we” as in the team that authors the articles on the site.

  • Nothing to see here

    The Vision Pro is the best thing that’s ever happened to the Meta Quest.

    • STL

      Yes. I stopped dreaming about a good VR headset from Apple and accepted Quest 3 as state of the art.

  • wcalderini

    Just looked at the length of this post. TLDR: “They have it already, they did it on purpose for VR Dummies”. LOL

    My hot take.
    Considering the amount of time that they have been working on this project, (I started hearing rumors about it soon after the original Rift and Vive were shipping), I am going to assume that most of what you requested in this article is already working and functional.

    Apple has ALWAYS been pretty confident about exactly who their customers are. They are pretty adept at knowing who will buy their product, how many will buy it, and what the expectations of said product will be.

    My thought is they assumed, maybe correctly, that for a large portion of the initial first purchasers this would be, or at least nearly be, their first VR/XR/Spatial Computing experience, and that they designed the initial software suite and experience to reflect that.

    They knew they would probably NOT get most of the established VR/XR market for two reasons. The first is the price. Just looking at the specs that were made available, most established VR/XR users already knew that other really great, and far less expensive, versions of THAT experience already exist. (It was a leap for me to make the choice, especially since I’ve had a Bigscreen Beyond on order since October.)
    The second reason is the pre-established use cases Apple has already been touting, and their known reluctance to embrace the gaming space, which is what the highest percentage of current VR users use their headsets for.
    Ancillary reason: Some folks, especially PC oriented folks, just hate Apple. And almost no amount of power or versatility would make them cross that line. (Haters WILL hate)

    So basically they knew they would be getting a TON of newbies to the “headheld” computer space. IMO (only) I think they released a very “training wheels” version of their computing experience just to get the early adopters acclimated to using this type of device. The “training wheels” were firmly in place.
    Some of the reporting I have been hearing about returns indicate that one of the primary reasons for returns is that it was “too complicated” to set up.
    LOL. Yeah. I know. First time user experience is almost ABSURDLY simple, and very intuitive. (Compared to the OG VIVE initial set-up or God Forbid, Pimax)

    And I can already hear the responses. “The Quest 3 is not that hard”.
    But for someone completely new to the space, especially if you are wanting a PCVR experience, it does require a certain amount of friction for first time use.
    Minimal, I agree, but a LOT more than the typical random iPhone user would encounter. AND Quest 3 is marketed as a Gaming Device. We all know it is capable of far more, but for Randy and Ramona Random it’s a gaming console for your face.

    So I think the “training wheels” concept Apple took is/was the correct approach as frustrating as it is. Most first time users are going to be almost overwhelmed by the simple concepts most of us take for granted.
    Right or wrong, I’m thinking MOST of what the article mentioned already exists in software, but will be rolled out in slow increments in the typical Apple fashion making it appear that they are delivering “new miracles” on a consistent basis.

    We already know from both IOS and MAC OS that all of the above exists and works on the Apple Silicon platform already, so I think it had to be a conscious choice to dumb it down for first time users.

    Then again. I could be wrong.
    That happens a lot too.

    But I think that by the time “Vision Pro 2 or VP lite” is ready to roll out all of that will have been “solved” incrementally JUST in time for new hardware to do it even better.

    • Ben Lang

      I think you’re right. Apple certainly could have had a more capable but also more complex interface, but building on it over time not only makes sure they’re going in the right direction, but also gives people time to train up.

      I think the question for Apple is not if they’ve thought of these things yet or have played around with them in prototypes, but which are the most important to start focusing on? That kind of directional signal comes from feedback of real users.

      I’m just hoping we don’t have to wait a whole year or so until we get some of these improvements.

  • Stephen Bard

    Since less than 1% of your readers actually own this ridiculously overpriced unversatile device, it seems like a disproportionate amount of coverage is devoted to the AVP. I guess it is still a novelty/curiosity for a little while longer.

    • Arno van Wingerde

      I have never used an AVP, nor do I intend to buy one. Nevertheless, I feel that Meta and Apple could learn a lot from each other and like to be informed about developments elsewhere. However, you are free to skip the articles that do not interest you…

      • Stephen Bard

        Yes, Apple "could" theoretically learn from others like Meta, but everyone has witnessed them making seemingly irrational decisions that fly in the face of common sense and even profit. Meta is willing to learn from others, but needs the overpriced accomplishments of Apple and others to motivate/intimidate them into optimizing their products. I do skip these AVP articles, but I still resent that this seriously flawed device gets so much attention just because it is manufactured by this "cult".