Meta Connect 2023 has wrapped up, bringing with it a deluge of info from one of the XR industry’s biggest players. Here’s a look at the biggest announcements from Connect 2023, but more importantly, what it all means for the future of XR.

Last week marked the 10th annual Connect, and the first since the Covid pandemic to have an in-person component. The event originally began as Oculus Connect in 2014. Having been around for every Connect, it’s amazing to look back at just how much has changed and how quickly it all flew by. For those of you who have been reading and following along for just as long—I’m glad you’re still on this journey with us!

So here we are after 10 Connects. What were the big announcements and what does it all mean?

Meta Quest 3

Obviously, the single biggest announcement is the reveal and rapid release of Meta’s latest headset, Quest 3. You can check out the full announcement details and specs here and my hands-on preview with the headset here. The short of it: Quest 3 is a big hardware improvement over Quest 2 (though still held back by its software), and it will launch on October 10th starting at $500.

Quest 3 marks the complete dissolution of Oculus—the VR startup that Facebook bought back in 2014 to jump-start its entrance into XR. It’s the first mainline Quest headset to launch since Facebook’s big rebrand to Meta, leaving behind no trace of the original and very well-regarded Oculus brand.

Apples and Oranges

On stage at Connect, Meta CEO Mark Zuckerberg called Quest 3 the “first mainstream mixed reality headset.” By “mainstream” I take it he meant ‘accessible to the mainstream’, given its price point. This was clearly in purposeful contrast to Apple’s upcoming Vision Pro which, to his point, is significantly less accessible given its $3,500 price tag. Though he didn’t mention Apple by name, his comments about accessibility, ‘no battery pack’, and ‘no tether’ were clearly aimed at Vision Pro.

Mixed Marketing

Meta is working hard to market Quest 3’s mixed reality capabilities, but for all the potential the feature has, there is no killer app for the technology. Yes, having the tech out there is critical to creating more opportunity for such a killer app to emerge, but Meta is essentially treating its developers and customers as beta testers of this technology. It’s the same ‘market it and they will come’ approach that didn’t seem to pan out too well for Quest Pro.

Personally, I worry that Meta is pushing the newfangled feature so heavily that it will distract the body of VR developers who would otherwise better serve an existing customer base that’s largely starving for high-quality VR content.


Regardless of whether or not there’s a killer app for Quest 3’s improved mixed reality capabilities, there’s no doubt that the tech could be a major boon to the headset’s overall UX, which is in substantial need of a radical overhaul. I truly hope the company has mixed reality passthrough turned on as the default mode, so when people put on the headset they don’t feel immediately blind and disconnected from reality—or need to feel around to find their controllers. A gentle transition in and out of fully immersive experiences is a good idea, and one that’s well served with a high quality passthrough view.

Apple, on the other hand, has already established passthrough mixed reality as the default when putting on the headset, and for now even expects it to be the mode users will spend most of their time in. Apple has baked this in from the ground up, but Meta still has a long way to go to perfect it in its headsets.

Augments vs. Volumes

Image courtesy Meta

Several Connect announcements also showed us how Meta is already responding to the threat of Apple’s XR headset, despite the vast price difference between the offerings.

For one, Meta announced ‘Augments’, applets that developers will be able to build and that users can place in permanently anchored positions around their home in mixed reality. For instance, you could place a virtual clock on your wall and always see it there, or a virtual chessboard on your coffee table.
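To make the idea concrete, here’s a minimal sketch of the underlying pattern: an applet stores only an anchor ID plus an offset from that anchor, and gets re-placed when the room is recognized on the next session. This is an illustration in Python with invented names and structures, not Meta’s actual Augments or spatial anchor API.

```python
# Hypothetical illustration of how an "Augment" could be persisted between
# sessions: the applet stores only an anchor ID plus an offset from that
# anchor, and re-resolves the anchor's world pose each time the headset
# relocalizes the room. Names and structures here are invented for clarity,
# not taken from Meta's SDK.
import json
import uuid
from dataclasses import dataclass, asdict

@dataclass
class Pose:
    position: tuple[float, float, float]          # meters, anchor-local space
    rotation: tuple[float, float, float, float]   # quaternion (x, y, z, w)

@dataclass
class Augment:
    applet: str        # e.g. "clock" or "chessboard"
    anchor_id: str     # ID of a persistent spatial anchor in the room map
    offset: Pose       # where the applet sits relative to that anchor

def save_augments(augments: list[Augment], path: str) -> None:
    """Persist placed augments so they reappear in the same spot next session."""
    with open(path, "w") as f:
        json.dump([asdict(a) for a in augments], f, indent=2)

def load_augments(path: str) -> list[Augment]:
    with open(path) as f:
        return [Augment(a["applet"], a["anchor_id"], Pose(**a["offset"]))
                for a in json.load(f)]

# Example: pin a virtual clock slightly above a wall anchor.
wall_anchor = str(uuid.uuid4())  # in practice this would come from the room scan
clock = Augment("clock", wall_anchor, Pose((0.0, 0.3, 0.0), (0, 0, 0, 1)))
save_augments([clock], "augments.json")
```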

This is of course very similar to Apple’s concept of ‘Volumes’, and while Apple certainly didn’t invent the idea of MR applets that live indefinitely in the space around you (nor did Meta), it’s clear that the looming Vision Pro is forcing Meta to tighten its focus on this capability.

Meta says developers will be able to begin building ‘Augments’ on the Quest platform sometime next year, but it isn’t clear if that will happen before or after Apple launches Vision Pro.

Microgestures

Augments aren’t the only way that Meta showed at Connect that it’s responding to Apple. The company also announced that it’s working on a system for detecting ‘microgestures’ for hand-tracking input—planned for initial release to developers next year—which look awfully similar to the subtle pinching gestures that are primarily used to control Vision Pro:

Again, neither Apple nor Meta can take credit for inventing this ‘microgesture’ input modality. Just like Apple, Meta has been researching this stuff for years, but there’s no doubt the sudden urgency to get the tech into the hands of developers is related to what Apple is soon bringing to market.
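For a rough sense of what detecting such a gesture involves, here’s a minimal sketch over hypothetical hand-tracking data: a pinch begins when the thumb and index fingertips come within a small threshold and ends when they separate past a larger one, with hysteresis so the gesture doesn’t flicker. The thresholds, names, and structure are illustrative assumptions, not drawn from Meta’s or Apple’s SDKs.

```python
# Hypothetical pinch-microgesture detector over hand-tracking data.
# Assumes a hand-tracking layer already gives us fingertip positions in
# meters; thresholds and structure are illustrative, not from any real SDK.
import math

PINCH_ON = 0.015   # fingertips closer than 1.5 cm -> pinch starts
PINCH_OFF = 0.030  # fingertips farther than 3 cm  -> pinch ends (hysteresis)

class PinchDetector:
    def __init__(self) -> None:
        self.pinching = False

    def update(self, thumb_tip, index_tip) -> str | None:
        """Returns "pinch_start" / "pinch_end" on transitions, else None."""
        d = math.dist(thumb_tip, index_tip)
        if not self.pinching and d < PINCH_ON:
            self.pinching = True
            return "pinch_start"
        if self.pinching and d > PINCH_OFF:
            self.pinching = False
            return "pinch_end"
        return None

# Example frame-by-frame usage:
detector = PinchDetector()
for thumb, index in [((0, 0, 0), (0.05, 0, 0)), ((0, 0, 0), (0.01, 0, 0))]:
    event = detector.update(thumb, index)
    if event:
        print(event)   # prints "pinch_start" on the second frame
```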

A Leg Up for Developers

Meta’s legless avatars have been the butt of many a joke. The company had avoided showing anyone’s legs because they are very difficult to track with an inside-out headset like Quest, and simple estimation can result in stilted and awkward leg movements.

Image courtesy Meta

But now the company is finally adding leg estimation to its avatar models, and giving developers access to the same tech to incorporate it into their games and apps.

And it looks like the company isn’t just succumbing to the pressure of the legless avatar memes by spitting out the same kind of third-party leg IK solutions that are being used in many existing VR titles. Meta is calling its solution ‘generative legs’, and says the system leans on tracking of the user’s upper body to estimate plausibly realistic leg movements. A demo at Connect shows things looking pretty good:

It remains to be seen how flexible the system is (for instance, how will it look if a player is bowling or skiing, etc?).

Meta says the system can replicate common leg movements like “standing, walking, jumping, and more,” but also notes that there are limitations. Because the legs aren’t actually being tracked (just estimated), the generative legs model won’t be able to replicate one-off movements, like raising your knee toward your chest or twisting your feet at different angles.
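To illustrate why estimation has these limits, here’s a deliberately naive sketch of the general idea: guess a plausible lower-body state purely from upper-body signals, so anything the heuristic (or model) hasn’t been built to infer simply can’t show up. Meta’s actual system is a learned model; the names, thresholds, and logic below are invented purely for illustration.

```python
# A deliberately naive illustration of "estimating legs you can't see":
# infer a plausible lower-body state from upper-body signals only.
# Meta's actual 'generative legs' system is a learned model; this heuristic
# just shows the shape of the problem (inputs: head/hand tracking, output:
# a guessed leg pose), with invented names and thresholds.
from dataclasses import dataclass

@dataclass
class UpperBody:
    head_height: float        # meters above the floor
    standing_height: float    # calibrated head height when the user stands still
    hip_position: tuple[float, float, float]  # estimated from head pose + torso lean

def estimate_leg_state(body: UpperBody) -> str:
    """Guess a coarse leg state from head height alone."""
    ratio = body.head_height / body.standing_height
    if ratio > 1.05:
        return "jumping"       # head noticeably above calibrated standing height
    if ratio < 0.7:
        return "crouching"     # head well below calibrated standing height
    return "standing"

def place_feet(body: UpperBody) -> list[tuple[float, float, float]]:
    """Place both feet on the floor under the hips, roughly shoulder-width apart."""
    hx, _, hz = body.hip_position
    return [(hx - 0.15, 0.0, hz), (hx + 0.15, 0.0, hz)]

# Example: a user standing still; one-off moves like raising a knee are invisible here.
body = UpperBody(head_height=1.72, standing_height=1.70, hip_position=(0.0, 0.95, 0.1))
print(estimate_leg_state(body), place_feet(body))
```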

Virtually You

The addition of legs coincides with another coming improvement to Meta’s avatar modeling, which the company is calling inside-out body tracking (IOBT).

While Meta’s headsets have always tracked the player’s head and hands using the headset and controllers, the rest of the upper body (arms, shoulders, neck) was entirely estimated using mathematical modeling to figure out what position it should be in.

For the first time on Meta’s headsets, IOBT will actually track parts of the player’s upper body, allowing the company’s avatar model to incorporate more of the player’s real movements, rather than making guesses.


Specifically, Meta says its new system can use the headset’s cameras to track wrist, elbow, shoulder, and torso positions, leading to more natural and accurate avatar poses. The IOBT capability can work with both controller tracking and controller-free hand-tracking.

Both capabilities will be rolled into Meta’s ‘Movement SDK’. The company says ‘generative legs’ will be coming to Quest 2, 3, and Pro, but the IOBT capability might end up being exclusive to Quest 3 (and maybe Pro) given the different camera placements that seem aimed toward making IOBT possible.
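As a rough sketch of what the difference means for an avatar rig, the example below mixes camera-tracked joints (where something like IOBT provides them) with estimated ones behind a single interface. The types, joint names, and fallback logic are hypothetical illustrations, not the Movement SDK’s actual API.

```python
# Rough illustration of mixing tracked and estimated joints in one body pose,
# which is the practical difference IOBT makes for an avatar rig. Joint names,
# flags, and the fallback logic are invented for illustration; the real
# Movement SDK exposes its own types.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Joint:
    name: str
    position: tuple[float, float, float]
    tracked: bool   # True = seen by the headset cameras, False = estimated

SKELETON = ["head", "torso", "shoulder_l", "shoulder_r",
            "elbow_l", "elbow_r", "wrist_l", "wrist_r",
            "hip", "knee_l", "knee_r", "foot_l", "foot_r"]

def build_body_pose(camera_joints: dict[str, tuple[float, float, float]],
                    estimate: Callable[[str], tuple[float, float, float]]) -> list[Joint]:
    """Prefer camera-tracked joints (wrists, elbows, shoulders, torso with
    IOBT); fall back to the estimator for everything else (e.g. legs)."""
    return [Joint(name, camera_joints[name], True) if name in camera_joints
            else Joint(name, estimate(name), False)
            for name in SKELETON]

# Example: IOBT supplies the upper body; the legs still come from estimation.
tracked = {"head": (0.0, 1.7, 0.0), "torso": (0.0, 1.3, 0.0),
           "wrist_l": (-0.4, 1.1, 0.2), "wrist_r": (0.4, 1.1, 0.2)}
pose = build_body_pose(tracked, estimate=lambda name: (0.0, 0.9, 0.0))
print([(j.name, j.tracked) for j in pose])
```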

Calm Before the Storm, or Calmer Waters in General?

At Connect, Meta also shared the latest revenue milestone for the Quest store: more than $2 billion has been spent on games and apps. That implies roughly a 30% platform cut: Meta has pocketed some $600 million from its store, while the remaining $1.4 billion has gone to developers.

That’s certainly nothing to sneeze at, and while many developers are finding success on the Quest store, the trajectory behind the figure reflects a slowdown in revenue momentum over the last 12 months, one which many developers have told me they’d been feeling.

The reason for the slowdown is likely a combination of Quest 2’s age (now three years old), the rather early announcement of Quest 3, a library of content that’s not quite meeting users’ expectations, and a still-struggling retention rate driven by core UX issues.

Quest 3 is poised for a strong holiday season, but with its higher price point and the lack of a killer app for its heavily marketed mixed reality feature, will it match Quest 2’s breakout performance in 2021?



Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • Andrew Jakobs

    Again, the video for IOBT with the article is not showing what it actually does for upper body/elbow/arm tracking like another video on the Meta developer YouTube channel. Legs are just a part of it, but the upper body tracking is really interesting. I tried to post the link to the YouTube video with the original article but it was never approved.

    • Arno van Wingerde

      I wonder if a few very cheap motion sensors with a band around the leg at the knees and feet would not help the “problem” for those who need it, e.g. for a dance app. For me, putting them on would be too much effort in most cases, but the cost should be minimal if you just want to track 4 points.

      • There is the Sony Mocopi which offers FBT via this approach, but it has drift problems from what I’ve heard.

        Eons ago there was this really cool project called TelariaVR. It used a strap, IMU, and pressure sensor on the bottom of the foot, which allowed for some really cool interactions. But it never went anywhere unfortunately. Here’s the proof of concept for it:

        youtu . be/hVZZ9JntP3o?si=PqXYUS_7kWu6NaQ5

      • Christian Schildwaechter

        TL;DR: Meta’s software-based FBT will most likely help those using hardware-based tracking more than hardware tracking will help those using Meta’s software-based FBT, but effectively almost everybody using VR will significantly benefit from FBT getting more support and out of its niche.

        Adding cheap motion trackers probably would help with apps that already support FBT, but there are very few of these to begin with. The problem with current FBT is friction and default configuration. There are actually a lot of solutions available for many problems, but whenever you increase complexity and cost, you lose a lot of users. With all hardware based full body tracking systems, you have to make some significant investment in time and money.

        Adding Vive or Tundra trackers is the closest to a plug-and-play solution, where all you have to remember is charging the trackers, but you have to use lighthouse and spend about USD 125 per tracking point. Sony’s Mocopi is quite user friendly in combination with their phone app, at about USD 75/tracked point in their standard six-tracker config, but as an IMU-based system you have to regularly recalibrate it. The comparable SlimeVR core set comes down to USD 32/tracked point, with not-so-polished software and the same IMU issues, but a lot of community support. And you can push that down to USD 15 by building them yourself and picking the cheapest components, plus a lot of work that can maybe be justified as a learning experience, but not as a money-saving measure when applying minimum wage.

        Only very few users interested in FBT will be willing to spend hundreds of dollars and/or varying amounts of time on setup and repeated recalibration. And of those, a lot will still get annoyed by having to strap several trackers to their limbs instead of jumping into VR within a few seconds. Consequently there is little incentive for developers to integrate FBT into their apps or games to appeal to these few users, and even less incentive to rely on it for game mechanics.

        This causes a typical chicken and egg problem, and condemns the tech to a niche. That’s even true for optional peripherals from the original manufacturer, e.g. the Gear VR initially relied on a small trackpad on the HMD, and only later versions got an additional 3DoF controller. But since the largest part of the user base didn’t have one, most apps never supported it.

        Making FBT (approximation) part of the default configuration on Quest via software solves a lot of that. For one, there is now a default option to integrate FBT into apps, so a lot of developers will start to experiment with it, even if it turns out to be inferior to current solutions. Once it gets integrated into more apps, SlimeVR trackers etc. should see a huge boost even though there is now a free alternative, simply due to there now being more use cases. And the hardware trackers will probably still significantly improve the tracking in the parts where Meta has to rely on smart guessing, or provide an upgrade option for Quest 2 users stuck with only guesstimated virtual legs. Integrating it in a useful way into games will take much longer, as a lot of it will work best or only on Quest 3, and Quest 2 will dominate the install base for a long time.

    • Christian Schildwaechter

      I guess you are referring to “Get Moving: the Latest from Movement SDK” youtu_be/B-pN-UzpnT4 . It’s an interesting 30-minute video, though clearly targeting developers and rather slow paced; end users would probably prefer a 2-minute cut only showing the relevant demonstration parts.

      It is very unfortunate that ad-based monetization now drives sites to make users stay as long as possible instead of referring them to other/original sources, and to toss out links in comments to discourage spam. Both break one of the fundamental principles of the web: the linking of relevant information as a quick way for people to get a more detailed picture, based on their own interests and needs. Instead we now have to rely on workarounds like misspelled URLs to evade spam filters, and while I usually despise someone replying with “google it”, it is at least an option, as long as the title of the relevant page or video is included as a reference.

      The term micro-transactions is now mostly associated with predatory gaming monetization, but the original concept of the WWW by Tim Berners-Lee from 1989 included not only links, but also back-links and true micro-transactions, to allow content creators to get paid for the information they provide, instead of having to drown everything in ads. And I often wish both the back-links and the monetization had been properly established. I would be very willing to pay a few cents to access information that, valued at minimum wage, costs me a hundred times more in reading time, if I could get directly to the relevant information instead of wading through tons of links to other articles, unrelated videos, walls of ads and now also AI-generated fluff first.

      Just being able to see which other sources have linked to an article could get rid of a lot of clickbait and spam, and make linking to external sources more attractive to reputable news sites, as users would use them as a filter to get to the good stuff. Without a standardized option for (sub-)cent transactions, we now have a web with fewer useful connections and a lot of extra effort for sorting through the increasingly aggressive layers added to pay for the creation of “free” information. Or monthly subscriptions that only make sense if you use just a few information sources, but not for accessing a large network of connected information, as the web was intended to be.

  • Ad

    Hard not to be extremely pessimistic about the future of this industry.

    • The XR industry in the sense of productivity features in businesses, education and military/aviation will be doing fine.
      It’s the game and entertainment sector that’s still wonky.

      • Lucidfeuer

        Military yes, but businesses and education? What have you been smoking?

    • shadow9d9

      Considering how incredible pancake lens clarity is, combined with the potential for XR, plus ringless controllers, plus AV1 for wireless PCVR streaming… VR is more exciting than ever before.

    • Arno van Wingerde

      Sure, the turnover going up by a measly order of magnitude in 3 years, from $6M/month to $60M/month, is a clear sign the industry is dying! You are one of those glass-half-empty types, I guess?

      • Traph

        60 million bucks a month – impressive, very nice.

        Now let’s see Reality Labs’ 2022 operating loss.

        Yes, I realize this is not an apples-to-apples comparison, so please hold back the “um ackshually”. The larger point is that the Oculus VR market is heavily warped by Zuck Bucks, and it’s astonishing to consider that another order-of-magnitude increase over the next three years would still put Meta ~10-15 years in the hole just from 2022 losses alone.

      • Lucidfeuer

        You clearly don’t work in this industry

  • Dragon Marble

    I don’t think Meta’s MR push is a “response” to Apple. These things take years to develop. It’s just that both companies see the same potential, and the technology is now good enough. Apple seems to be all in on MR while Meta is still testing the water.

    • Christian Schildwaechter

      I don’t think this was meant as in Meta started developing tech to react to Apple. We don’t even know what they spend the USD 10bn a year at MRL on, but it is safe to assume that they have a huge number of projects we have never heard of, of which only a few will ever make it to market.

      So at this point a reaction to Apple is less a new development and more picking things they had already developed internally, like the micro gestures or Augments, and making them publicly known. Which is fine in principle, a smart move for PR to prevent anybody from pointing out features that the AVP might have and the Quest 3 might not, and also cheap, as they very likely already existed somewhere at MRL.

      The problem is the lack of a consistent UI philosophy. I have no doubt that they have had Augments for a long time, as those are mostly an extension of the spatial hooks already available on Quest 2 for hanging virtual pictures on your MR walls. I was still somewhat baffled by them presenting the concept, as it mostly makes sense for an HMD where passthrough is basically always on, even during the use of applications. That is the case for the AVP, where we haven’t even really seen apps using its full “immersive mode” VR. But the Quest 3 first and foremost runs the same software as the Quest 2, and the UI is mostly used as a launcher to start apps that take over the whole view. An Augments clock or chat window or weather widget positioned in MR is of only limited use on Quest, so why is this feature getting so much exposure now, long before there is a reason for users to really stay in MR on Quest?

      Meta will probably be able to counter every AVP feature with something very similar from their own labs, but this may actually backfire. Apple is often criticized for not having certain features that e.g. Android offers, and it is regularly a deliberate design choice. They aim for a very consistent set of core features that will work flawlessly together in a way that intuitively makes sense, and things that don’t fit get cut, even if the users complain. That can be very annoying, but is the basis for their high scores in usability.

      In contrast, Meta is already somewhat known for their inconsistent UI/UX, with several reasons for that mentioned in the article. If instead of first fixing the base they now start throwing extra features on top of that, like Augments for the still unproven MR, or micro-gestures without the eye-tracking-based UI that Apple uses them for, they’ll just make it more confusing.

      A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away.

      Antoine de Saint-Exupéry

      • Dragon Marble

        There are at least two reasons to stay in Q3’s MR home: watching movies or playing Xbox Cloud games — especially if you want to do it together with someone sitting next to you.

        • Arno van Wingerde

          I already have a device that can do that: an OLED TV that blows the Quest 3 away. AFAIK it can also do Xbox Cloud.

          • Dragon Marble

            Not in 3D. Also, the TV I want is too big for my room.

          • JanO

            To my understanding, games will only display on a 2D virtual screen…. Where did you get that this involved actual 3D gaming?

          • Dragon Marble

            I was talking about movies. I don’t know about games. But they should add 3D support. Otherwise, I can’t really think of a flat game I want to play.

      • One thing that people very often overlook is the “it just works” mentality of Apple products. They don’t need to be first. They don’t need to be affordable (at least in first-gen products of a new category). They just make the overall sum of its parts better than the competition. That’s where the AVP will fit right in.

      • Arno van Wingerde

        I agree with your observation about the author portraying this as a reaction to Apple’s Vision Pro. It’s not like Mark saw the announcement and quickly included those into the Quest 3…
        And the Quest 3 really is the first MR device, affordable or not; after all, the Vision Pro is not available, whereas the Quest 3 is… so the “affordable” part could also be a reference to the Quest Pro…

      • Lucidfeuer

        The sole fact that they probably copyrighted this “MR” thing as a reaction to Apple’s headset shows how beyond, disgustingly, mediocre and illegitimate Meta is. Augments as an obsolete marketing artifice show they’re going nowhere.

        I don’t even think Apple markets the Vision as “MR”, and they seem to actually know better than to do so.

  • That is a great summary! Thanks for putting in your thoughts and comparisons to the AVP, especially in regards to the cursed UI.
    Yes, the air is getting thin for Zuckerberg. We will see Apple, probably Valve and also Samsung come swinging at them by the end of next year.

  • Dawid

    I am concerned about the Quest 3’s optimal head strap position. I have seen that for many people it pushes on and folds their ears. Even Mark Zuckerberg himself has this issue. I hope it is only incorrect adjustment and not a design flaw.

    • Dragon Marble

      What I found (on Quest 2) is that you just need to push the interface (the hard part of the soft strap) up a little. It doesn’t affect the comfort at the back. People seem to forget that it is a soft strap and can be bent.

    • Octogod

      It has my favorite strap yet. It’s soft, yet easy to get a snug fit. I wouldn’t worry.

  • This is a great article, but I think it misses two things:
    1. The announcement of the Ray-Ban Meta
    2. The big attention Meta gave to AI. The Quest was a side dish, while the main dish was Meta AI, Llama 2, and the integration between AI and the Ray-Ban smart glasses. This says a lot about Meta’s priorities at the moment.

    • Steve R

      Agree, the new Ray Bans with AI assistant are a big deal.

      Yes, they are (correctly) giving a lot of attention/priority to AI. But Quest was still announced first in the keynote (they could have led with AI). I think XR is still getting plenty of attention/priority.

    • Christian Schildwaechter

      I wonder how much of the focus on AI was actually for investors. Connect is an XR developer conference, so the emphasis on AI was somewhat odd. Not the use of AI, which for years has driven a lot of Meta’s XR research, like the predictive body tracking or the integration of neural chips into SoCs to power life-like mapping of scanned user faces onto generic avatars with very low power requirements.

      AI/machine learning now drives a lot of tech, because it can be a lot cheaper to train a network to correctly guess a result than to run hardware that actually calculates it, which will help a lot in the future with e.g. rendering at high resolutions. But that happens mostly in the background, and companies like Apple almost never explicitly mention it; it is just an implementation detail of new features, like the API or language used. Whenever it is emphasized, it is usually for marketing purposes, because everybody saw the results of ChatGPT and therefore now knows that this is “the future”.

      The same is probably true for Meta, whose massive spending at MRL has drawn a lot of criticism. So now they attached the more trendy AI to the apparent money pit that is XR, to make it look more appealing to their investors. And they actively avoided the term “Metaverse” due to it now being mostly associated with a sort of overhyped cloud-cuckoo-land. But I doubt that their strategy has really changed all that much. Becoming a, or the, dominant player on the potentially dominant medium of the future is still the target, and it will still take decades. AI has always been a part of the tool box, and Meta has released several very powerful open source products over the last few years, including the LLMs currently en vogue, and published research papers on how to use them not just to create text or images, but whole worlds as 3D geometry, and to populate them.

      It’s just that the public all of a sudden became aware that more AI is coming, and Meta is riding the wave of public interest to get investors to keep letting them burn through USD 10bn a year at MRL, because now it will all use AI, and that’s all they really want/need to know. The users will only notice the effects, like their legs staying in plausible positions instead of flailing around, and can blissfully ignore whether this is due to better camera tracking, smarter IK, improved apps, machine learning, or a combination of all of them.

    • Lucidfeuer

      Nobody cares about the Ray-Ban Meta; it’s not like the Spectacles made any waves anywhere. And yet this might, as you said, have been the only interesting announcement of Connect.

  • Steve R

    In one of the sessions they announced that mixed reality passthrough will be the default mode.
    IMO this is very big for usability.

    • Octogod

      Do you know which one?

      • Steve R

        Unlocking the Magic of Mixed Reality. Time 3:30 on the Youtube version.

        • Octogod

          Thank you!

  • Arno van Wingerde

    Hm… my definition of augments: “virtual junk in the living room”.

  • Christian Schildwaechter

    It’s often sad to see how much further VR could already be if it had attracted a larger user base. When Valve/HTC introduced lighthouse tracking in 2016, which is based on a lot of small, rather simple sensors, they expected the cost to drop significantly thanks to mass production and economies of scale. Had this worked out, we would probably have sub-millimeter precision trackers for USD 10 or less by now, and attach them to everything from arms and legs to hats, ping pong paddles, coffee mugs and pets. And there would never have been legless avatars that finally got approximated legs for standard movements after years of delay.

    Instead the number of VR users stayed rather small, with way too many even leaving after a short while. And we only get a rather limited selection of either cheap, effectively subsidized hardware from big companies with a lot of strings attached, or high-margin, small-scale production products, often targeting business customers with matching prices. Thanks to open source you can now build your own simple eye tracking solution from individually sourced components for less than USD 25, which is a lot more than integrating the same features into a mass-produced HMD would cost, while a commercial eye tracking module that should require even less hardware than the DIY solution comes at five times its price, if you can get it at all.

    I was so convinced in 2014-2016 that VR would take the world by storm and sell in the millions, quickly driving down costs. Had this happened, we would now all use FBT in much more extensive and complex virtual worlds with much higher visual fidelity thanks to ETFR and more developer interest. I am fully aware of the many problems VR still has, and even a huge success wouldn’t have significantly sped up SoC, display or lens development due to fundamental problems. Nonetheless I’m still trying to figure out where it all went wrong, and how we ended up in a slow-paced and expensive niche, despite all the amazing possibilities the technology can provide.

    • Guest

      It went wrong by repeating history, and none of the tech giants have learned from it.

    • XRC

      It’s my second go-round with VR, having realized the huge potential during my interaction with Virtuality in ’91-92 whilst a student studying industrial design. We had some early industrial headsets running on our Silicon Graphics workstations at university.

      Since getting the Vive Pre in 2016 I’ve been super impressed with the equipment’s ability to generate presence, despite current limitations, and for me the SteamVR tracking is a key ingredient.

      As esteemed Stanford professor Jeremy Bailenson said in a recent interview when asked about the five most important aspects of presence: “tracking, tracking, tracking, tracking, tracking”.

      Also thought it would become more popular, but here we are in ’23…

  • Octogod

    Well said.

    More dire: while the drop from $60M to near $40M happened within a year, it also came with a much higher number of apps in the store and in App Lab. So the average ROI on Quest games has dropped massively.

    Meta hasn’t learned that ‘Move Fast and Break Things’ doesn’t work when what’s breaking is the UI/UX on your face. People don’t adapt to the new interface, they just don’t engage with it at all. It has the opposite effect of increasing engagement.

    And Connect had a handful of MR demos, but 3 of the 5 were FPS wave shooters. They were the only ones where I saw people stop playing mid-session and go “we got it”. It’s clear the tech is there, but even with a year and a half of developer exploration, the experiences are not.

    I’m bullish on Quest 3. But Meta needs to focus on why people buy the headset, not on winning a war with Apple.

  • Cragheart

    I am all for stylized, not fully realistic graphics, but I think the current “Metaverse version 0.01” just doesn’t look good enough. I am not expecting 100% realism or anything like that, but Horizon Worlds needs a significant graphical update and an increase in the maximum number of people gathered in one place at the same time. It’s rather cringey and weird at the moment, imo. VRChat seems to make more sense in 2023.