Apple unveiled Vision Pro on Monday, its long-awaited standalone headset capable of both virtual and augmented reality. While the Cupertino tech giant seems to be emphasizing Vision Pro’s AR capabilities thanks to its color passthrough cameras, it’s also going to pack one of VR’s most prominent social apps, Rec Room.

Apple’s launch of Vision Pro is still a good bit away—it’s coming first to the US in early 2024 at the hefty price of $3,500. Still, what apps the Fruit Company will allow on the undoubtedly very curated Vision App Store will be telling.

As first noted by UploadVR, among them will be the hit social VR game Rec Room, which already offers cross-play across SteamVR, Meta Quest, Meta PC VR, PSVR, PlayStation 4/5, Xbox, iOS, Android, and standard monitors via Steam.

Rec Room was the only native VR app shown during the part of the keynote discussing third-party apps, which are coming to the headset via Apple’s integration of the Unity game engine.

Notably, Vision Pro doesn’t offer any sort of motion controller, instead relying on hand-tracking, eye-tracking, and voice input. In the past, Rec Room has primarily targeted motion controllers for VR input; however, the app is also set to bring both full-body avatars and new hand models to the platform, which will seemingly do away with the game’s wristless mitten-hands.

  • Sinistar83

    And Rec Room still has no PSVR 2 port yet…

    • gothicvillas

      PSVR 2 has 0 social apps

  • Duane Aakre

    I got my first headset, an HTC Vive, in February 2017 and RecRoom Paintball has been my most played game ever since. I don’t understand how they can do paintball without some kind of controllers with triggers to fire the weapons. If it is just going to be standing around in the RecCenter talking to hordes of little kids – a big NO THANKS!!!!!

    • Christian Schildwaechter

      You really don’t understand how this could easily work? You never pretended to hold a gun as a kid with your arm stretched out, aiming at something, and then pulled the non-existing trigger in mid-air? Something that could effortlessly be detected by advanced hand tracking (see the sketch below), and work a lot better than just a controller thanks to eye tracking telling RecRoom what you were actually aiming at when you pulled “the trigger”.

      You of course won’t get haptic feedback, but a short buzz wasn’t exactly a realistic recoil simulation anyway. And you could still get ultra-realistic 3D sound of the trigger action, with the projectile noise changing according to your room and where you currently stand in it, thanks to the Vision Pro’s room acoustics “audio raytracing”. Maybe the hordes of little kids shooting imaginary guns were onto something, realizing that the core of the experience is not holding onto a piece of plastic.
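
      To make the hand side of that concrete, here is a rough sketch of how an app could treat a thumb-to-index pinch as a “trigger pull”, assuming visionOS’s ARKit hand-tracking API roughly as shown at WWDC; the fireWeapon() hook and the 1.5 cm threshold are made up for illustration, and the eye-tracking part is not covered here:

      import ARKit
      import simd

      // Hypothetical game hook, not an Apple or Rec Room API.
      func fireWeapon(from handTransform: simd_float4x4) { /* spawn a paintball along the hand's aim */ }

      // Watch both hands and treat a thumb-to-index pinch as a trigger pull.
      // Requires a running immersive space and hand-tracking permission.
      func watchForTriggerPulls() async throws {
          let session = ARKitSession()
          let handTracking = HandTrackingProvider()
          try await session.run([handTracking])

          for await update in handTracking.anchorUpdates {
              let hand = update.anchor
              guard hand.isTracked, let skeleton = hand.handSkeleton else { continue }

              // Joint transforms are expressed relative to the hand anchor.
              let indexTip = skeleton.joint(.indexFingerTip).anchorFromJointTransform.columns.3
              let thumbTip = skeleton.joint(.thumbTip).anchorFromJointTransform.columns.3
              let pinch = simd_distance(SIMD3(indexTip.x, indexTip.y, indexTip.z),
                                        SIMD3(thumbTip.x, thumbTip.y, thumbTip.z))

              // ~1.5 cm: fingertips touching counts as a trigger pull (made-up threshold).
              if pinch < 0.015 {
                  fireWeapon(from: hand.originFromAnchorTransform)
              }
          }
      }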

      • Duane Aakre

        1) Having to say “Pew, Pew, Pew” as fast as you can for a 30-minute session of paintball would get old very fast.
        2) How does this work when your hand is outside your field of view? There will be times when you’re shooting in one direction to suppress the opposition while looking 180 degrees the other way for other opponents, depending on the IMU in the controller to keep the gun pointed more or less in the right direction.
        3) Frequently, you’ll have a weapon in each hand pointed in different directions and firing simultaneously. How does this work without triggers? Do you need different voice commands to fire both weapons simultaneously or each independently?

        I don’t see why Apple wouldn’t come out with optional controllers. Not offering controllers is leaving money on the table, and why would they do that? I mean, they could probably charge $500 apiece for the controllers, and until someone else comes out with an equally high-resolution headset, some of us would happily pay.

        • Christian Schildwaechter

          1) Why would you say “Pew, Pew, Pew” when all you need to do is bend your finger into the same position as you would if you were holding a controller?

          2) If you aim with your gun, your hands are within the field of view by definition, otherwise you are not aiming. If you regularly shoot from the hip during paintball, this could be a problem, but so far it seems that the Vision Pro has excellent hand tracking thanks to its downward-facing cameras. Those aren’t there for shooting from the hip; instead they are supposed to pick up subtle gestures you make with your hands lying in your lap while sitting down, but that should work great for cowboys too.

          But yes, you can of course construct a situation where this approach fails, just like you can hold a Quest controller behind your back and shoot without aiming for a couple of minutes until tracking is lost. But you probably won’t hit anything that way, so it would be less about paintball and more about creating a very unrealistic scenario just to keep insisting that controllers/IMUs are absolutely necessary. Usually people aim when they shoot, because the general idea is to hit some target. And as mentioned before, adding eye tracking makes aiming a lot better than any controller alone ever could, and makes for a much more immersive experience.

          3) If you fire with two weapons, you still usually fire in one direction, the one you are currently looking in, unless you just want to make some noise. So as long as both of your hands are within, let’s say, a 160° vision cone in front of your gaze direction, they will both be tracked, and bending either index finger can be detected as triggering a gun (see the sketch below). Sure, you will find some edge case where you want to shoot blindly, but I already covered the plausibility of that argument above.
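
          To be concrete about “either index finger”: each hand arrives as its own anchor with a left/right flag, so two weapons can fire independently or together without any voice command. A rough sketch, assuming the same ARKit hand-tracking API as in the earlier sketch (the Weapon class and pullTrigger() are made up for illustration):

          import ARKit
          import simd

          // Hypothetical per-hand weapon state, not an Apple or Rec Room API.
          final class Weapon {
              func pullTrigger(aimTransform: simd_float4x4) { /* fire this hand's paintball gun */ }
          }

          let leftWeapon = Weapon()
          let rightWeapon = Weapon()

          // Called per hand update; triggerPulled comes from the pinch/curl detection sketched earlier.
          func handle(_ hand: HandAnchor, triggerPulled: Bool) {
              guard triggerPulled else { return }
              let weapon = (hand.chirality == .left) ? leftWeapon : rightWeapon
              weapon.pullTrigger(aimTransform: hand.originFromAnchorTransform)
          }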

          Hand tracking is definitely not the best solution to simulate a gun; that would be an actual gun replica with a 6DoF tracker and decent recoil mechanics. Without access to these, controllers will do fine except for some edge cases. Without controllers, hand tracking will do fine except for some other edge cases, but this may be compensated on Vision Pro by your improved aim. Ideally you’d combine the eye tracking with the tracked gun replica for the best feel and highest accuracy, and I’m sure someday someone will offer exactly that as a VR accessory for those that really hate hand tracking on principle.

          And Apple won’t offer optional controllers, because this is not a VR headset, and they don’t want it to be perceived as that. Of course it will be able to run VR apps with hand tracking, just like you can drive a thin nail into soft wood with a Quest 2 if you are very careful, but being able to run VR software on a device clearly positioned for AR doesn’t make it a VR headset any more than hitting nails makes a Quest 2 a hammer. Which is why complaints about the Quest 2 lacking a wooden handle or the Vision Pro lacking controllers are somewhat misplaced; they would just be in the way during the intended use.

          Maybe Apple will allow others to provide third-party controllers like the Quest Pro controllers, but that won’t matter, because games for the Vision Pro will be designed for the default input method, hand tracking, and basically none of the users will own controllers. You could use optional controllers in special apps that integrate support, the same way you can use full-body tracking today, meaning mostly VRChat, with other apps simply ignoring it. That may improve with OpenXR becoming more refined and getting wider support in the future, which could even allow freely mixing input devices and using Quest Pro controllers with a Vision Pro. Just not anytime soon.

          And forget arguing with Apple that their AR headset should support controllers for VR users, let alone Apple providing them, no matter what you would pay. MacBook users have wished for touchscreens for years, and Apple’s iPads offer some of the best touchscreens available. Macs can now run iPad apps directly, but you have to use the trackpad, as Apple said a long time ago that touchscreens don’t match the macOS UI, so we won’t get any. macOS and iOS get closer with every release, so maybe someday they will change their mind, but until then, they just won’t support touchscreens, not even external ones.

          • Andrew Jakobs

            Have you ever played a VR shooting game with a 360° environment? I shoot sideways or behind me while looking the other way, so my hands are definitely not in range of any camera, unless they put some on the back.

          • Christian Schildwaechter

            It’s great that you all are busy coming up with edge cases where hand tracking will fail while controller-based VR will work, but to what purpose? It should be obvious that hand tracking can cover a lot of the use cases we use controllers for so far. It is also obvious that there will always be cases where controllers are at least better, and sometimes the only viable solution.

            But the whole discussion was about how it would be possible to use games like RecRoom with hand tracking only, simply because there will be no controllers on AR headsets, as holding something in your hand will interfere with interacting with your environment, the very purpose AR headsets are designed for. And it is really not that hard to come up with multiple possible solutions here.

            I’m not asking anybody to give up their VR headsets and controllers. I’m just asking that people stop claiming that controllers are indispensable for any HMD. By now a lot of people have described their experiences with the Vision Pro after hands-on demos, and they are pretty much all glowing about the usability due to the seamless integration of eye tracking and very good hand tracking. On a device that was presented with a quite different set of use cases than current VR HMDs, for a very different target group than VR gamers.

            Shooters became a dominant genre because they work well with a mouse to point, a few buttons to click and WASD to move, and you usually interact with objects from a distance. Imagine having to do anything useful in your life with that type of interface. Now there is the first implementation of a device that no longer limits us to extremely primitive input, but basically enables all the hand-eye coordinated interactions that we rely on in real life and which a lot of our brain activity is used for.

            And instead of spending a minute considering how this could enable new forms of interaction and improve game designs limited for decades by primitive “point-and-press-button” input not allowing for any delicate interaction, a frustrating number of people deliberately ignore the design purpose and the added interaction options, and instead basically demand that it should be what we already have, with bizarre edge case arguments to prove that this is the only acceptable way.

            Nobody is taking away your VR headsets and controllers, so why the widespread need to preemptively discredit a device that you have never tried, that is designed for a very different purpose than VR gaming and for a very different user group already heavily invested into Apple? I’d appreciate it if people would at least try to judge it by what it can do and was designed for, instead of mostly asking it to be something else and then declaring it useless when it isn’t.

            Sorry if this “rant” hit you for no apparent reason. I’ve just been baffled the last few days by how a community so focused on future technology can be so mentally inflexible regarding other approaches, attacking them instead of embracing that we are finally getting more and different options.

        • Ondrej

          The elephant in the room here is that Apple’s vision for gaming is pancake games played with a GAMEPAD on virtual screens + some 3D gadgets and effects around you. Not true VR gaming, which is exhausting, nauseating, requires bigger budgets, and comes with big design challenges. Publishers will love the fact they can simply drop the latest Call of Duty from Xbox straight onto Apple’s headsets and add some gadgets.

          Sad, but it’s how they see it.

    • Atlas

      They’ll probably add VR controllers next year.

  • Christian Schildwaechter

    Still, what apps the Fruit Company will allow on the undoubtedly very curated Vision App Store will be telling.

    Apple curating an App Store would be a first, especially since the plan seems to be to give the Vision Pro a dedicated section within the current iOS App Store. Necessary, as you’d obviously need the HMD for those apps, but otherwise not separating it from the current offering, as at least for some time existing iPad apps from the regular App Store will be the first choice for productivity apps and tools.

    Apple has never curated any App Store in the way that Meta or Sony do, limiting who is allowed to publish there based on a non-transparent case-by-case decision system. There is no way to guarantee your app will ever get onto the Quest store, while the Apple App Store only requires you to follow some generic rules. These may be stricter than some would like, e.g. no adult content whatsoever, no interpreted code that would allow changing the behavior at runtime and circumventing Apple’s automatic malware detection. But if you stick to these rather simple rules, you can publish pretty much whatever you want, and get the same conditions as anybody else, with the occasional Apple dick move exception. Apple could of course change the rules, but why should they?

    Meta justified the strict curation with the need to keep up a high quality level and a polished, well-optimized experience on a pretty heavily underpowered VR platform, at a time when many of the 5000+ VR titles on Steam were unoptimized shovelware that could bring even the fastest GPUs to their knees for seemingly nothing. A dungheap of hastily released, unoptimized money-grab Quest VR apps could have seriously tainted the perception of the new platform, which was fighting to be noticed outside of a niche anyway. So keeping quality requirements high was a legitimate argument for heavy curation, at least for some time. Today, with 500+ apps, they could probably trust that crappy apps will get flushed out by ratings. And users are protected by a return policy that lets them try apps without financial risk, so opening up the Quest store somewhat shouldn’t be all that critical anymore. They should at least end the disgusting “App Lab cannot be properly searched” policy.

    Apple has none of these problems with the Vision Pro. Their initial sales pitch is using either built-in functionality or one of the thousands of established iPad apps where sorting out the bad ones through user feedback already works fine. Compared to Meta trying to lure AAAs or at least AAs, Apple also has enough brand power and money to get companies like Disney as launch partners. And they have their own high quality content like 4K+ 360° multi-switchable-camera-position sport, concert and event streaming with NextVR or high bandwidth 4K 3D movies with AppleTV+. They will probably talk to a number of established VR developers to port over apps from Quest, PCVR or PSVR that could be used with (excellent) combined eye and hand tracking only. So there will most likely be a plethora of good content on launch, with no chance that a few crappy apps could ruin the image of the platform.

    No doubt a lot of XR developers will jump to the Vision Pro, if only because they will be able to place their apps on the official store, but these will make up only a small (and hopefully good) portion of all the usable content. So compared to Meta, there simply is no need to add curation to one specific section of the App Store; the mechanisms in place will work just fine for the Vision Pro too.

    • Ondrej

      Apple’s preferential treatments and limitations are awful, especially when the app has any kind of user made content.

      Even Linus Media, which is a corporation, went through hell, because they were considered too small for Apple to let them do “risky stuff” like letting users comment…

      They had their certification reverted so many times and had to remove so many features (that other apps already had) that at one point they almost abandoned iOS completely.

      Walled-garden devices aren’t computers and never will be. They are a corporate anti-consumer joke. Zuckerberg talking about the Quest one day replacing laptops is just as laughable.

      • Christian Schildwaechter

        “User generated content” is a problem due to a variant of the “no adult content whatsoever” rule. Basically Apple will not allow things that are controversial, so if you have an app that allows users to add content, you have to guarantee that they will not add anything controversial that others will see (with a number of exceptions). That of course is not possible, so you have to show that you are capable of moderating user generated content through a mix of automation and humans who can react to complaints within a short timespan. If you cannot provide this, you will not be allowed to let users add content, because you lack the means to keep controversial content away from the platform.

        This is one case of the “These may be stricter than some would like” rules, not preferential treatment. Linus Media would probably have been free to add/keep the feature if they had hired a sufficiently large team of moderators that allowed them to provide the required oversight 24/7, with enough staff buffer to cover people being sick, on holiday etc. Something a larger company has by default to deal with the large amount of potentially unacceptable user content, but which would be prohibitively expensive for a smaller company. LMG might also have outsourced user content moderation to an external service that could provide that coverage, but didn’t.

        I don’t know what exactly was required, but a simple rule like “you have to take down illegal user content within 15min” can easily cause such a situation. That isn’t preferential treatment of some larger companies or unfair limitations for smaller ones. That is setting a strict rule and leaving it to developers how they manage it. And if they can’t, the feature that doesn’t comply has to be removed.

        I’m not a fan of walled gardens, but there is still a very big difference between Meta not even telling developers what the rules to get onto their store are, most likely because they aren’t fixed and Meta doesn’t want people to be able to force their apps onto the Quest store by following a set of rules, and Apple setting very strict rules that may make it impossible for some developers to implement “risky” features, because they cannot fulfill some requirements that they themselves may consider excessive anyway.

        Sure, that means not everyone will be able to get apps with user generated content onto the App Store, but that is because of an organizational or financial issue on the developer side, not a random decision by Apple. If that developer somehow managed to secure the funds necessary to comply with Apple’s moderation requirements, they would be able to get the app published.

        And these rules are known beforehand; LMG knew about the limits regarding user generated content, they just interpreted them differently than Apple regarding how far they could go. Trying to stay just within hard-to-define limits will occasionally cause apps to first be accepted, but later rejected again, which is of course annoying and expensive to deal with and can lead to developers stopping projects when a feature close to the edge of the rules is considered essential. But the exact same rules would have been applied to any other developer too, which is very different from how Meta or Sony handle it.

  • fcpw

    They’ve already said Bluetooth controllers will be supported. So PS/Xbox controllers, triggers, etc.
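
    Presumably through the same GameController framework the rest of Apple’s platforms already use. A rough sketch, assuming visionOS keeps the iOS-style API (the print call is just a placeholder for game logic):

    import GameController

    // Listen for a gamepad connecting and react to its right trigger.
    NotificationCenter.default.addObserver(
        forName: .GCControllerDidConnect, object: nil, queue: .main
    ) { note in
        guard let pad = (note.object as? GCController)?.extendedGamepad else { return }
        pad.rightTrigger.valueChangedHandler = { _, value, pressed in
            if pressed {
                print("fire, trigger at \(value)")   // placeholder for game logic
            }
        }
    }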

  • Well, RecRoom is compatible with almost every platform, so this doesn’t surprise me