Epic Games today announced that Unreal Engine 5 is now available in early access, including a new template for VR projects built on OpenXR 1.0, the industry-backed standard for building device-agnostic VR and AR applications.

Epic Games is bringing heaps of new features and improvements to the latest version of its game engine, Unreal Engine 5. While some of the biggest features—like the Lumen lighting system and Nanite mesh system—don’t yet support VR, today’s early access release of UE5 includes a rebuilt VR template built on OpenXR.

The VR template in Unreal Engine 5 is designed as an ideal starting point for VR projects. It has basic and extensible functionality for VR locomotion and object interactions built right in.

And because it’s built on OpenXR, developers targeting multiple headsets shouldn’t need to do as much headset-specific configuration. The system also supports OpenXR extensions for adding vendor-specific functionality.

The template configures Unreal Engine 5 with settings specific to VR, like automatically disabling the Lumen feature (because it isn’t yet supported for VR headsets).
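For illustration, project-wide settings of this kind typically live in a UE5 project’s Config/DefaultEngine.ini. The snippet below is a hypothetical sketch of what such VR-oriented settings might look like; the exact variable names and values are assumptions, not copied from the actual template:

```ini
; Hypothetical sketch of VR-oriented settings in Config/DefaultEngine.ini
; (variable names and values are illustrative assumptions, not the template's actual file)
[/Script/Engine.RendererSettings]
; Use no dynamic global illumination instead of Lumen, which doesn't support VR yet
r.DynamicGlobalIlluminationMethod=0
; Forward shading and instanced stereo are common VR rendering optimizations
r.ForwardShading=True
vr.InstancedStereo=True
```

In practice the template applies these choices for you, which is part of what makes it a convenient starting point.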


Epic says it’s “highly recommended to create your VR project using the VRTemplate in UE5,” and notes that the OpenXR VR template supports the following VR platforms:

  • Oculus Quest 1 and 2
  • Oculus Quest with Oculus Link
  • Oculus Rift S
  • Valve Index
  • HTC Vive
  • Windows Mixed Reality

Epic says that the early access version of Unreal Engine 5 isn’t yet suitable for production, but developers looking to experiment can download it today through the Epic Games launcher and can even get access to the UE5 source code on GitHub. The full production ready version of Unreal Engine 5 is expected to release in early 2022 with additional features and improvements.



  • BonWOLF

    Missed the best part. 1080p input = 4K output! :)

    • kontis

      Welcome to the year 2021 where native rendering is a thing of the past.
      Total resolution of human eye is similar to the number of pixels a 4K display has.
      It had to end sooner or later, real-time rendering is all about cheating or we would be still playing pong.

      • xyzs

        Man, stop spreading wrong info. The human eye is way above 4K….

        I have a 5K screen in front of me that takes up maybe half of my field of view, and I can still clearly see the aliasing of the pixels.

        If I recall correctly, a study a few years ago concluded that around 11K or 12K per eye is needed to match the peak of 10/10 human vision.

        • guest

          Yeah, cones and rods are analog inputs, but the detail that gets to the brain is orders of magnitude higher in digital terms.

          • silvaring

            I’ve read that the human eye can resolve around 130 million pixels (130 megapixels, with only 6 of those megapixels seeing in color) at a speed of around 200Hz? That’s a 16K display refreshing at very high speeds. A Windows Mixed Reality headset only does around 4 million pixels at a rate of 90Hz. That’s still far off, isn’t it?

          • Bob

            There are no frame rates in real life but if we were to apply this to what our eyes see in the physical realm it would amount to pretty much infinite depending on the individual. For this reason, it’s not possible to perfectly resolve motion and we are definitely nowhere near getting close to how our eyes perceive motion in real life through the technology of today or even the near future.

        • Cdaked

          At least 60 ppd (pixels per degree).
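The 60 ppd (pixels per degree) figure can be put in context with a quick back-of-the-envelope calculation. The per-eye resolution and field-of-view numbers below are rough, illustrative approximations, not official specs:

```python
def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Approximate angular resolution, assuming pixels are spread evenly across the FOV."""
    return horizontal_pixels / horizontal_fov_deg

# Rough, illustrative per-eye numbers for a Quest 2-class headset
quest2_ppd = pixels_per_degree(1832, 90)
print(round(quest2_ppd, 1))  # prints 20.4

# A headset hitting the 60 ppd target across the same 90-degree FOV
# would need far more horizontal pixels per eye:
print(60 * 90)  # prints 5400
```

By this rough measure, current headsets sit at roughly a third of the 60 ppd target.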

        • Bob

          “Man, stop spreading wrong infos. Human eye is way above 4k….”

          I think he was being sarcastic here.

  • Cdaked

    No Lumen, nor Nanite in VR… Disappointing.

    • PK

      it’s only a matter of time, epic seems to have big plans for the metaverse, and if even a small percentage of nanite’s abilities are usable with whatever headsets are available next year, that’s a massive performance improvement. i wonder how much this could translate to networking many more people in the same space. might still be severe limitations but this seems like it could help immensely, even if it’s over a year away.

      • Epic’s “metaverse” just refers to their ecosystem between Fortnite, its in game promotion/event/whatever crap, the other F2P one trick pony crap they’ve acquired lately, their Epic Games Store that they want to shove on every device independent from first party console stores, and even inside those stores, etc., that they used a dumb term like “metaverse” for it doesn’t mean they’re building Ready Player One’s VR universe for real.

        • PK

          i think you’re greatly downplaying how integral unreal could be to the future of social vr. sure they’ll probably want to build on their fortnite universe, but they’re also big on empowering others using their engine. and if unity doesn’t keep up i’ll have to switch at some point if i want to really impress people. hopefully though there’ll be healthy competition, and epic holds up their promise to connect different types of virtual worlds, although it may only end up extending between unreal users.

      • kontis

        it’s only a matter of time

        Tell that to all the VR devs who waited for years for basic features broken in VR to be fixed, while Epic didn’t care at all.

        • Andrew Jakobs

          Well, what’s stopping you from trying to fix it yourself and then submitting the fixes? You’ve got access to the complete source…

        • PK

          i’m curious what some examples are of issues they’ve ignored? i know there’s something that’s held social apps in vr back from using it but not sure what those are. still, i assume once they get fully on board that Epic will fix this.

          • SchoolHomeVR Technology Assist

            VR Editor has not seen an update or fix in 4-5 YEARS. Certainly no new tools or features. I have asked to simply be able to edit SPLINES (even just Landscape Splines) in VR Editor every year since it came out. Still nope. I love the thought of building a miles-long river in a terrain in VR, in an immersive way, carving out waterfalls with the terrain sculpting tool (which works fine in VR Editor!), but no spline work. Even on YouTube, if you look up VR Editor, the last video from them was 3-4 years ago!

      • Cdaked

        I wish! I’m only interested in VR development at the moment, so I’m starting with both Unity and Unreal Engine to decide between the pros and cons, but from people I know at Epic, it doesn’t look like Nanite or Lumen will be available for VR for quite some time, if ever.
        I also heard from a senior Epic manager that the issue of taking eye-tracking into account was not a priority, so it doesn’t sound like Epic is betting too much on VR.
        Nanite and Lumen require a single camera and high-speed disk access, so it’s more for consoles like Sony’s PS5, which, remember, has invested hundreds of millions of dollars in Epic over the last two years.

        • Andrew Jakobs

          Well, one shiny point might be PSVR2…

      • Cdaked

        As for Epic’s metaverse, it should be noted that there is no social VR platform that uses Unreal Engine: they all use Unity.

        • PK

          I thought I heard something surprising once about Facebook using Unreal for at least one of their social VR attempts, but I’m not positive about it. As for others, some use their own! The smart thing was to use Unity because it’s more modular; people can do a lot without learning an entirely new tool, although there is still a lot of work to do making an SDK work with it, especially if you’re allowing your community to share amateur works. VRChat put in a ton of time, effort, and years of shared testing, and it’s paying off now. That said, about half of the artists I collaborate with on VRChat projects already use Unreal, so once a platform fully embraces it I don’t think it’ll take all that long to build up.

        • Elite-Force_Cinema

          *Tencent’s metaverse

    • Elite-Force_Cinema

      And disappointing means you don’t like it! Get your facts right on what the word disappointing means you clown!!!!!

  • Elite-Force_Cinema

    Source 2 is better than this communist Chinese game engine! Change my mind! Oh wait. You can’t! And that’s sad… for you Epic shills!!!

    • kontis

      Source 2 was supposed to be available to devs 5+ years ago. It looks like it never will be.

      So you are comparing apples and oranges: Source 2 is not a tool that can be used to make games. It’s Valve’s own toy, and they rarely make games, so it may as well not exist.

      • Cdaked

        And Godot? v4.0 could be interesting.

      • Do you know more about Source 2 in comparison to other engines? Is there a way to compare functionality and performance?

        • Elite-Force_Cinema

          Yes. And it is very simple! It is made by Valve, not the Chinese overlords at Epic Shit and Tencent who bend their knees to China like Blizzard Entertainment and Activision!!!

  • kontis

    Ben, if you ever have an opportunity to interview Epic it would be really nice if you could ask them a difficult but real question:

    Why are they neglecting VR? Even basic features are broken for years and they don’t fix them. Many VR devs give up on Unreal and go back to Unity.

    https://forums.unrealengine.com/t/niagara-render-broken-in-vr-with-instanced-stereo-on/131049

    • aasdfa

      im one of those devs… i was all over their forum posting about issues in unreal for vr and some never got fixed.

    • xyzs

      Answer: money, as always.
      The day VR is a significant share of their income, they’ll fix them.

  • Woah!

  • It would be hard to imagine, for Quest 2 development anyway, anything being better than 4.26. File sizes fell DRAMATICALLY, while the frame rate improved. After years of bloat they finally trimmed the useless stuff out of mobile packages.

    A new engine is going to be full of new features, which will take up more resources even if not used, and will likely be more buggy in the short term. Epic has a history of overbuilding things and adding to bloat. Might be best to shy away from UE5 for a few generations and feel it out before moving to the new engine.

  • xyzs

    I hope all engines soon integrate an equivalent of Nanite and Lumen.
    No more LODs (and no more time wasted creating them, no more disk space wasted storing multiple variations of the same assets).
    No more lightmaps (and no more textures wasted creating them, no more baking tasks for studios).
    More simplicity for game artists, better performance, and lighter games, all at once… I want this everywhere.

    PS: Godot already has a Lumen-like technology called SDFGI for the upcoming 4.0; I hope they create a Nanite equivalent next :)

  • Schadows

    Considering the poor state of the early access build right now (a crash fest), I’m pleasantly surprised they already included a working VR template. I would have thought it would come later in the dev cycle.

  • oomph2

    Exciting.
    They should also have AI based scenery generation.
    If I type ‘fill tundra vegetation’, then the entire (or selected) area should get filled automatically with that type of vegetation (instead of picking and placing one by one, ugh).
    Character generation could be similar to TES Skyrim’s.