‘Battlefield V’ Shows What NVIDIA RTX Ray Tracing Means for the Future of Gaming

Ray tracing simulates the way that light bounces around a scene and interacts with objects before entering our eyes to create the image of the world around us. It’s the kind of CGI rendering approach used in blockbuster movie graphics, and the same technology that allows architects to create truly photorealistic images of buildings which haven’t yet been built. But for a long time ray tracing has been far too slow for practical use in gaming, because where complex movie CGI might take minutes or even hours to render just one frame, the gaming baseline is about 30 frames per second. Thanks to NVIDIA’s GeForce RTX GPUs, Battlefield V shows what real-time ray tracing capabilities mean for gaming graphics.
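At its core, "simulating the way light bounces" means firing a ray from the camera and solving for the nearest surface it hits. A minimal, purely illustrative sketch of that intersection test in Python (not how an RTX pipeline is actually written):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2 for the smallest t > 0.
    """
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0 else None
```

A real renderer repeats tests like this across millions of rays per frame and recurses on every bounce, which is why dedicated ray tracing hardware matters for hitting game framerates.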

A major feature of NVIDIA’s new RTX GPUs is hardware which accelerates ray tracing operations to the point that they can be used practically in real-time gaming. Ray tracing can power a wide range of enhancements by accurately simulating everything from reflections to shadows and even sound waves.

Real-time reflection of an explosion in the eye of a Battlefield soldier. | Image courtesy EA DICE, NVIDIA

This week at an NVIDIA event ahead of Gamescom in Germany, the company revealed their RTX GPUs and spent much of their presentation explaining why they call ray tracing the “holy grail of computer graphics.” Among the examples in the presentation was a particularly illuminating segment showing how Battlefield V developer DICE has incorporated real-time reflections into the game which are only possible with ray tracing.

Ray tracing is a much more accurate way of rendering computer graphics compared to raster-based approaches, around which a wide range of tricks and hacks for faking complex visual phenomena—like reflection, transparency, lights and shadows—has evolved. As it turns out, those elements are a major factor in what passes as “real” to our brain, and the hacks aren’t always convincing.
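Reflections make the difference concrete: the raster-based hacks sample a pre-baked cube map or reuse what is already on screen, while a ray tracer bounces the actual ray off the surface and keeps tracing. The bounce itself is one line of vector math; a minimal sketch (the helper name is mine, not from any engine):

```python
def reflect(direction, normal):
    """Mirror an incoming ray direction about a unit surface normal:
    r = d - 2 (d . n) n  -- the bounce a ray tracer follows for reflections."""
    d_dot_n = sum(d * n for d, n in zip(direction, normal))
    return tuple(d - 2.0 * d_dot_n * n for d, n in zip(direction, normal))
```

Screen-space reflections can only reuse what is already rendered on screen; tracing the reflected ray works even when the reflected object, like an explosion behind the camera, is never drawn to the screen at all.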

Incorporating ray tracing into real-time gaming opens the door to subtle but powerful visual cues that make it easier for our brain to accept that what we’re looking at is real. In the long term, this will be a major factor in the level of immersion that can be achieved in VR.

SEE ALSO
GeForce RTX Cards Announced with VirtualLink VR Connector

Today, however, a day after NVIDIA’s GeForce RTX announcement, it isn’t yet clear how complex ray-traced rendering powered by RTX cards, like the real-time reflections demonstrated in Battlefield V, will fare against the demands of VR headsets, which require exceedingly high framerates and resolutions. We’re hoping to learn more about the applicability of the RTX cards’ ray tracing capabilities to VR this week during Gamescom.

  • Suitch

    “gaming baseline is about 30 fps” my ass. Remember your audience when writing an article. We game at a baseline of 90fps here. (Sarcasm)

    • In fact, at the end he says that performance for VR still has to be evaluated

  • VR4EVER

    I can hardly see any benefit for VR in terms of ray tracing. Better AI, fresh ideas and a new take on the medium is what’s needed, not how realistically an explosion reflects in a window. That comes second, at least in my book. If VR is even capable of running such stuff @90fps, at least…

    • JJ

      you are stupid. I’m not even going to explain why cause there’s no point

      • VR4EVER

        Dude, kontis did that perfectly. Btw. thanks for calling me stupid, you must feel better now.

        • Marc-André Désilets

          Kontis’ facts are wrong; don’t use them as your baseline for understanding what ray tracing is. Not only would ray tracing be a mess in VR for multiple reasons (including different sampling per eye and AI interpolation), but since it’s based on camera position, everything would have to be rendered twice, with no possible stereo instancing.

          • silvaring

            AI interpolation is getting better in leaps and bounds though, dude. It’s how Nvidia is experimenting with foveated rendering algorithms that correct shimmering artifacts, which wasn’t even a thing just three years ago.

          • Marc-André Désilets

            Yep, but most of this development is based on new AA techniques, not necessarily ray tracing techniques.

        • JJ

          Yeah, Kontis was on and off with his facts; he included some wonky ones and left out some of the best benefits.

          The biggest non-performance benefit to me is just the fact that all the light data is REAL. It’s not guessed cube-map reflections anymore; it’s literally following the path a light ray would take in the real world, and this gives us so many more options than we had before.

          Most of which we haven’t even begun to work with, because the tech is new for real-time environments.
          The reflections in the demo were the flashy new cool piece, while the shadows and ambient lighting were the utilities and base for what’s to come. They weren’t really too special, because at this point ray tracing compute is on par with, if not slower than, previous methods. BUT the number of rays we can trace is multiplying at a high rate, and within a very few years we will be seeing some very, very crazy shit.
          Plus, right now ray tracing is too expensive for an HMD’s dual displays, but soon ray tracing will surpass Pascal and rasterization methods and make VR run smoother and more efficiently.

          It’s a slow process and starts out weak, but they weren’t kidding when they said ray tracing is the holy grail of graphics.

      • Raphael

        You seem to have reached the same conclusion as me. This happened recently on facebook when a VR hater listed reasons why VR was no good… reasons: “have to drink through a straw”, “have to remove headset before going to the kitchen”. I explained to him that I couldn’t be bothered to explain why he’s an idiot. I explained that he was too far gone…

      • fdf

        He may be wrong about the benefits of ray tracing, but it’s true that VR as a whole is lacking in originality. You get a couple of good games but nothing “superb”, killer-app worthy. There isn’t a game that stands out when someone mentions VR. VR needs content NOW, visuals later.

        • JJ

          What? That’s not the discussion we’re having here, and even then there are killer games, whether you agree on them or not. Beat Saber, VRChat and Onward have all made a splash in the gaming industry.

    • kontis

      You don’t know what ray tracing is, but some of that is Nvidia’s fault here (their presentation wasn’t good – it’s not just about some cool effects). Ray tracing has already revolutionized the movie VFX/animation industry this decade. Here are some of the aspects that will affect VR in the future:

      – shorter pipeline resulting in lower latency

      – rendering R,G,B channels separately resulting in perfect, free and faster chromatic aberration correction

      – perfect lens matched rendering for any FOV (even 360 deg) that is impossible in rasterization, completely removing a need for distortion correction

      – many, many times better scaling and greater efficiency in foveated rendering

      – perfect scaling with multiple GPUs and future MCM chiplet GPUs – again, not possible in rasterization

      • Zerofool

        What they announced yesterday is a hybrid pipeline which combines rasterization with ray-tracing for particular calculations like GI, shadows, reflections, area lights, etc. Also, AI is thrown in the mix for denoising the ray-traced results due to the low number of samples per pixel. They are not throwing away rasterization entirely any time soon. So the pipeline is actually more complex now, not simplified. Take a look:
        https://www.twitch.tv/videos/299680425?t=02h22m33s

        All the things you listed are great and will come with a truly RT-only pipeline, but that will happen in a few GPU generations at the earliest. No sane developer would make a game that only runs on Nvidia cards (3 of their cards, to be exact), so AMD needs to offer a competitive chip with DXR support first, and I would be surprised if that materializes before 2020.
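The denoising point above has a simple statistical root: each pixel’s ray-traced value is an unbiased but noisy estimate, and combining more samples shrinks the error roughly as 1/√n. NVIDIA’s approach is a trained denoiser rather than brute-force averaging, but a toy sketch (made-up numbers, plain averaging) shows why so few samples per pixel need cleanup at all:

```python
import random

def one_sample(true_value, spread, rng):
    """One ray-traced estimate of a pixel: right on average, but noisy."""
    return true_value + rng.uniform(-spread, spread)

def averaged(true_value, spread, n, rng):
    """Averaging n estimates shrinks the noise roughly as 1/sqrt(n)."""
    return sum(one_sample(true_value, spread, rng) for _ in range(n)) / n

rng = random.Random(0)
noisy_err = abs(one_sample(0.5, 0.4, rng) - 0.5)        # error of a single sample
smooth_err = abs(averaged(0.5, 0.4, 256, rng) - 0.5)    # error after 256 samples
```

The appeal of a learned denoiser is getting similar smoothness from only a sample or two per pixel, by exploiting spatial and temporal neighbors instead of hundreds of extra rays.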

        • kontis

          Sure (this is why I added “in the future”), but without that transitional period (hybrid) we would never have full path-tracing, so it’s a very important, necessary step.

          OTOY already achieved 4K60 FPS in Brigade on Turing via Vulkan. It’s still a bit too noisy, but 4K = 8 mpixel ~= resolution of a human eye, so we are getting closer than some people think.

          • Zerofool

            I agree, it’s a necessary step. I just wish they focused on an industry standard like DXR or Vulkan RT, and not proprietary crap. History shows that none of these ever get wide support or stick long-term: 3D Vision (twice), TXAA, PhysX, Ansel, VXGI, VXAO, HairWorks and other GameWorks effects. I don’t see why it will be any different with RTX.

            >> OTOY already achieved 4K60 FPS in Brigade on Turing via Vulkan
            Source? I’d like to know more. Last info I have is from their panel at the Unity booth during GDC.

      • VR4EVER

        Thanks, I know what ray tracing is. What I didn’t know were the points you listed above, thanks for that. Sadly they focused mainly on ray tracing.

      • Mike

        Didn’t revolutionize movie graphics this decade – it was 20 years ago with Toy Story. It’s just taking real-time graphics a REALLY long time to catch up.

        • Marc-André Désilets

          Toy Story was not using ray tracing; the first movie Pixar made fully using ray tracing and global illumination was Monsters University.

          • Mike

            Oh, interesting, I always thought Toy Story used ray tracing. Though actually, I looked it up, the first Pixar movie to use ray tracing was Cars (2006).

          • Marc-André Désilets

            Yes, Cars was using some ray tracing features like reflections, AO and shadows, even if it was still using a ton of lights/point lights to fake global illumination ;)

      • Trip

        Thanks, this somewhat answers what I was about to ask: “Would using this ray tracing tech significantly improve performance if we kept to roughly the same level of visual fidelity we are using now with fake rasterization reflection and shadow techniques?”

        • Marc-André Désilets

          No, it would not increase performance; it would only give you better fidelity and better shadows/occlusion/reflections. Ray tracing does not support some of the current tech like GPU particles.

      • Marc-André Désilets

        I would say, you don’t know what ray tracing is.

        Light bounces in real time are still a slow process and require an AI filter to show a decent result. The result you are getting is far from the raw result and leaves room for a lot of interpolation; for now the picture is generated in real time with a very low number of rays compared to what you would get in the VFX/animation industry. You can’t even compare them: what we have in real time with an RTX graphics card is something like 10x less precise than what you would get out of a preview render in a common ray tracing renderer. This is why, in the current implementation, ray tracing is used for AO, shadows and reflections; the base pass is still done on the classic pipeline. Ray tracing does not support GPU particles and other FX that are currently used in video game engines and that most game developers wouldn’t want to sacrifice. The rest of your comment doesn’t make any sense and/or is wrong.

      • silvaring

        Do you think the standard gaming PC in a few years will be an APU that runs most current and past apps (e.g. 2D video editing), with a dedicated GPU that specializes in ray tracing and VR? Do you think ray tracing will see evolutionary scaling similar to Moore’s law?

  • ale bro

    I have two eyes, which have slightly different views of a scene. Does this mean that ray tracing in VR will have twice the load of ray tracing in pancake gaming, which has only a single view?

    • kontis

      Yes, but there won’t be the additional multi-view overhead you get in rasterization. The increased load will be equivalent to the additional traced pixels, so just like for a larger monitor / higher resolution.

      In the future it will actually be running much better in VR than on monitors, thanks to eye tracking and the fact that raytracing is like 10x better for foveated rendering than rasterization.
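The foveated-rendering point is easy to quantify: a ray tracer can distribute rays per-pixel, so a tracked fovea can get full density while the periphery gets a fraction. A back-of-the-envelope sketch, where the 5% fovea and 10% periphery densities are made-up illustrative numbers:

```python
def ray_budget(width, height, fovea_frac, periph_scale):
    """Rays per frame under a two-zone foveated scheme:
    full density (one ray per pixel) inside the fovea,
    a reduced fraction of that density in the periphery.
    fovea_frac:   fraction of screen area covered by the fovea
    periph_scale: fraction of full ray density used outside it"""
    total = width * height
    fovea = total * fovea_frac
    periphery = (total - fovea) * periph_scale
    return int(fovea + periphery)

full = 3840 * 2160                            # naive: one ray per pixel everywhere
foveated = ray_budget(3840, 2160, 0.05, 0.1)  # fovea at full density, periphery at 10%
```

At these illustrative densities the foveated budget is roughly 7x smaller than brute force, which is the kind of scaling the comment above is pointing at.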

      • ale bro

        thanks for explaining this :)

        so 4K per eye is exactly the same as running twin 4k monitors, if they have the same refresh rate?

        • Marc-André Désilets

          No, multiple techniques are used, like stereo instancing, and some FX are not rendered in stereo at all (depending on the distance of the object). Depending on your 3D scene, if it’s well optimized it’s more like 1.4x a 4K screen :P

      • mirak

        You could still use foveated rendering on monitors with a gaze tracker.
        Even on a 6 inch smartphone screen, when you focus to read text, the rest of the screen is blurry and could be rendered with lower fidelity.

  • Raphael

    And the irony is we won’t see any battlefield game in VR.

  • Lucidfeuer

    So basically they’re justifying the crazy price hike with partially ray-traced reflections we probably won’t see working out of the box. Clap clap… I’m just waiting for actual rendering benchmarks.

    • Jeppe

      What crazy price? It’s not more expensive than new GTX cards normally are

      • Lucidfeuer

        Right…

    • Jerald Doerr

      God.. I always hear it here… But I’m going to have to say it.. “It’s new tech!”

      Plus the PCB is almost twice the size of the 1080 Ti’s… so it’s going to be faster than the last one, hands down…

      • Lucidfeuer

        It’s not “new” tech, even if you’re one of those whose brain has been lobotomised by recent marketing propaganda. It’s actualised and updated technology, not upgraded.

        The fact that you think there’s anything incredible about a tech that is faster 4 years after the previous model shows this marketing propaganda (championed by Apple, 8 years ago) has worked.

        • Jerald Doerr

          Dude… Do you know anything about ray tracing or are you just talking out your flaming ass? Have you ever sat at your computer and rendered a frame and had to wait 2 min or 2 hours for a single image to render? Fuck I’ve had 6 hour renders just for one frame on 120 computers…

          So dude… Stfu cuz I know you don’t know shit about what you’re talking about.. If you don’t want one then don’t worry about spending your paper route money or your dads who obviously works for AMD…

          • JJ

            yupp thank you!

          • Lucidfeuer

            6 hours of rendering on 120 computers? It’s so funny how you’re not even credible…and to just realise it was a matter of you being an Nvidia fanboy cunt…

      • Marc-André Désilets

        You’re absolutely right! It will be incredibly fast. But that’s not because of the RTX chip; it’s because the main rendering chip is a BEAST!

    • Marc-André Désilets

      It’s funny, because if they do a hardware real-time ray tracing benchmark right now, it’s probably only going to have the Nvidia RTX cards on it… For now, there’s no other chip on the market specialized in real-time ray tracing + AI filtering. At least not that I know of.

  • Jerald Doerr

    Look people… RTX is more for developers/artists, but players will benefit too… But what’s funny is Nvidia is working their ass off to sell it to the customers..

    I can cheat almost anything ray tracing can do.. but it might take me a week… If I can just turn it on and be done with it looking right in 5 min… guess what I want to use?

  • Marc-André Désilets

    By the way, by using planar reflections, most of these “killer features” are achievable on current gen.

    • david vincent

      Planar reflections only work for planar surfaces; this rasterization approach will not allow for glossy surfaces.

      • Marc-André Désilets

        My point is that they are comparing it with SSR, but there are many other ways to do real-time reflections that are better than SSR. Just one other example, dynamic reflection probes: https://youtu.be/lhELeLnynI8?t=8m4s

        • david vincent

          Yeah, in the end this RTX tech is pretty anecdotal.
          “Foveated tracing” is what we need.

          • Marc-André Désilets

            Can’t wait to see the future of this tech too; we have already ordered our cards and I’m really excited to see what we will achieve with them ;) (“by the way, hello Québec!”) Do we know each other, or are you spying on LinkedIn to see who you are talking to?

          • david vincent

            I just clicked on your name out of curiosity because it sounded French and I saw “le Journal de Montréal” :-) (I’m French myself)

  • mirak

    Ray tracing is the future of VR because it’s what you need for plenoptic rendering, meaning the eye can naturally focus at the right depth instead of the fixed depth we have in current HMDs.
    As you can see in this video, Nvidia has been working on that for 4 or 5 years: https://youtu.be/J28AvVBZWbg
    We won’t have that in an HMD soon, but Nvidia is working on it.

    Accurate eye focus is important because, like parallax, stereoscopy or occlusion, it’s a tool our brain uses to evaluate depth.

    It also reduces eye strain.

    Also, with a plenoptic HMD you would not even need prescription glasses; the correction can be done in software.