NVIDIA’s latest GPU is here and it offers a big performance bump, but what exactly does that power deliver for the VR gaming enthusiast? We pit the new Nvidia GTX 1080 Ti against the GTX 1080 to see just how far each card can enhance VR image quality through supersampling.

It’s frightening the pace at which the GPU industry moves. Here we are, less than one year after Nvidia launched its brand new line of 10-series ‘Pascal’ architecture graphics cards with the GTX 1080, back with a new card which promises not only to outgun its predecessor by a significant margin, but, on paper, to match the performance of Nvidia’s flagship GPU, the ludicrously pricey and powerful Titan X.

SEE ALSO
Analysis: Latest VR Ready GPUs Benchmarked and Compared with NVIDIA 'FCAT VR'

NVIDIA’s GTX 1080 Ti – the Titan X Killer?

The new GTX 1080 Ti is here and offers a step change in performance compared with the last-generation, Maxwell-architecture GTX 980 Ti.

[Table: GTX 980 Ti vs. GTX 1080 Ti specification comparison]

This is certainly impressive, and you can see why Nvidia are keen to emphasise the progress that’s been made since the 980 Ti’s launch in 2015. But the real story here is that this new card’s closest performance stablemate is the current generation $1,200+ ultra-enthusiast card, the Titan X. In fact, the GTX 1080 Ti is built around the same GP102 GPU used in Nvidia’s Titan X released last year. With 12 billion transistors, GP102 is “the most powerful GPU Nvidia has ever made for gaming.”

[Image: GeForce GTX 1080 Ti block diagram]
The 1080 Ti block diagram shows the card’s underlying architecture

The GeForce GTX 1080 Ti ships with 3,584 CUDA Cores, 28 Streaming Multiprocessors (SMs), and runs at a base clock frequency of 1,480 MHz, while the GPU Boost clock speed is 1,582 MHz. And as we’ll discover, there’s quite a bit of headroom in both memory and core clocks. The 1080 Ti sports 11GB of GDDR5X VRAM, just 1GB shy of the Titan X, a shaving you’re very unlikely to notice even when gaming at 4K or supersampling at extreme levels. In other words, the 1080 Ti has just made the Titan X effectively obsolete.
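To put those numbers in context, peak single-precision compute can be estimated as CUDA cores × boost clock × 2 FLOPs per clock. Here’s a quick back-of-the-envelope sketch of that arithmetic for the two cards under test (the GTX 1080 figures are Nvidia’s published reference specs):

```python
# Back-of-the-envelope peak FP32 throughput: cores * boost clock * 2 FLOPs/clock.
# Reference boost clocks are used; real-world clocks under GPU Boost are typically higher.
cards = {
    "GTX 1080 Ti": {"cuda_cores": 3584, "boost_mhz": 1582},
    "GTX 1080":    {"cuda_cores": 2560, "boost_mhz": 1733},
}

for name, spec in cards.items():
    tflops = spec["cuda_cores"] * spec["boost_mhz"] * 1e6 * 2 / 1e12
    print(f"{name}: ~{tflops:.1f} TFLOPS FP32")
# GTX 1080 Ti: ~11.3 TFLOPS, GTX 1080: ~8.9 TFLOPS -- a ~28% advantage on paper
```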

Bear all of that in mind, and consider that the new GTX 1080 Ti shipped last week for $699, the same price as its GTX 1080 predecessor went on sale for just 10 months ago. It’s also launching at this price a mere 8 months after the 10-series Titan X, owners of which may justifiably feel their wallet wincing at their short lived performance supremacy.

Testing Methodology & ‘FCAT VR’

The world of cutting edge GPUs may move quickly, but one of the reasons why virtual reality remains fascinating is that it’s moving even faster. Last year’s GTX 1080 review opened with an apology of sorts, stating that as VR itself was in its infancy, we had no tools to record metrics at the level of empirical detail which standard PC gaming enthusiasts take for granted. As of this week, we’re allowed to publish benchmarks based on Nvidia’s newly released FCAT VR, a frame analysis tool which records VR runtime data in detail and lets us peek under the hood to see if and when VR rendering safety nets like Asynchronous Spacewarp and Asynchronous Timewarp/Reprojection are kicking in under load.
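Under the hood, FCAT VR’s value comes down to per-frame timing measured against the headset’s 90Hz refresh budget. The sketch below is purely illustrative (the function and classification are our own simplification, not FCAT VR’s actual output format), but it shows the basic idea: any frame that misses the ~11.1ms budget is a candidate for ASW or reprojection to step in.

```python
# Illustrative sketch only: classify per-frame GPU render times against the
# 90 Hz VR budget. The data layout here is hypothetical, not FCAT VR's own format.
FRAME_BUDGET_MS = 1000.0 / 90.0  # ~11.1 ms per frame at 90 Hz

def classify_frames(frametimes_ms):
    """Count frames delivered within budget vs. frames that would fall back
    to a synthesized frame (ASW/reprojection territory)."""
    on_time = sum(1 for t in frametimes_ms if t <= FRAME_BUDGET_MS)
    missed = len(frametimes_ms) - on_time
    return {"delivered": on_time, "missed_budget": missed}

# Example: a run that mostly holds 90 fps but spikes under load
print(classify_frames([9.8, 10.5, 11.0, 13.2, 12.7, 10.1]))
# {'delivered': 4, 'missed_budget': 2}
```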

SEE ALSO
NVIDIA Announces 'FCAT VR' Frame Analysis Tool to Help Demystify VR Performance

As the 1080 Ti is considered a high-end GPU for dedicated enthusiasts, we wanted to really get to grips with the benefits such extreme performance can provide VR gamers. While current generation headset displays are limited in overall pixel density (meaning a visible panel structure), one of the biggest immersion breakers is jaggies (aliasing) caused by a low target render resolution. We’ve therefore concentrated our VR benchmarking on testing the limits of the GTX 1080 and 1080 Ti and their ability to supersample the image to extreme levels. Supersampling is a compute-intensive way to reduce aliasing (the appearance of obvious pixels or stepping in a digital image): the scene is first rendered at a much higher resolution, then down-sampled to the display’s native resolution, using that extra detail to produce a much cleaner result. Outside of game-specific rendering options, supersampling is the easiest way to improve image quality and immersion.
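The catch is that the cost of supersampling grows with the square of the multiplier, which is why a seemingly modest setting can flatten even a high-end GPU. A quick sketch of the arithmetic (the ~1344×1600 per-eye baseline is an approximation of the Rift’s default render target; exact figures vary by headset and runtime):

```python
# Supersampling cost scales with the square of the (per-axis) multiplier.
# Baseline is an approximate default per-eye render target for the Rift CV1;
# exact values vary by headset and runtime.
BASE_W, BASE_H = 1344, 1600  # approx. per-eye render target at 1.0x

for ss in (1.0, 1.5, 2.0, 2.5):
    w, h = int(BASE_W * ss), int(BASE_H * ss)
    pixels_per_frame = w * h * 2          # two eyes
    relative_cost = ss * ss               # fill-rate cost vs. native 1.0x
    print(f"{ss:.1f}x: {w}x{h} per eye, "
          f"{pixels_per_frame/1e6:.1f} MPix/frame, ~{relative_cost:.1f}x cost")
```

At a 2.0× multiplier and 90 frames per second, that works out to more than 1.5 billion shaded pixels every second before any post-processing is applied, which is exactly the sort of load we want these cards to absorb.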

As man cannot live on VR gaming alone, we’ve also assembled a selection of visually sumptuous and computationally taxing games, each benchmarked with tests designed to highlight the raw grunt each card possesses.

[Image: GeForce GTX 1080 Ti]

Overclocking

Although we’ve only had limited time with the 1080 Ti thus far, we did manage to ascertain what we think is a stable (and fairly generous) overclock on our supplied Founders Edition unit. Pushing the core clock to +170MHz above base with an additional +400MHz bump for memory, we cautiously kept fan speed fixed at 80%, with temperatures maxing out around the 80-85 degree mark. These numbers are provisional, but they provide a healthy boost to performance with no additional cooling or voltage applied – and they proved stable. We’ve included overclocked results in some of the benchmark breakdowns. Interestingly – for those of you squeamish about damaging such a pricey piece of hardware – you only need to lift the cap on the card’s power and thermal throttling limits to realise some significant gains.
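For reference, here’s roughly what that core offset means for the rated clocks. This is a nominal, back-of-the-envelope calculation rather than a measurement; with GPU Boost 3.0 the clocks you actually sustain depend on temperature and power headroom:

```python
# Nominal effect of the +170 MHz core offset on the 1080 Ti's rated clocks.
# Real sustained clocks depend on temperature and power limits under GPU Boost 3.0.
BASE_MHZ, BOOST_MHZ = 1480, 1582
CORE_OFFSET_MHZ = 170

oc_base = BASE_MHZ + CORE_OFFSET_MHZ
oc_boost = BOOST_MHZ + CORE_OFFSET_MHZ
print(f"Base:  {BASE_MHZ} -> {oc_base} MHz")
print(f"Boost: {BOOST_MHZ} -> {oc_boost} MHz "
      f"(~{(oc_boost / BOOST_MHZ - 1) * 100:.0f}% on paper)")
# Base:  1480 -> 1650 MHz
# Boost: 1582 -> 1752 MHz (~11% on paper)
```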


Testing Rig

We partnered with AVA Direct to create the Exemplar 2 Ultimate, our high-end VR hardware reference point against which we perform our tests and reviews. Exemplar 2 is designed to push virtual reality experiences above and beyond what’s possible with systems built to lesser recommended VR specifications.

Test PC Specifications:
PSU: SuperNOVA 850 G2, Modular Cables, 80 PLUS® Gold
Motherboard: MAXIMUS VIII GENE LGA 1151 Intel Z170 HDMI SATA 6Gb/s USB 3.1 USB 3.0 mATX Intel Motherboard
CPU: Core™ i7-6700K Quad core (4 Core) 4.0 – 4.20GHz TB, HD Graphics 530, LGA 1151, 8MB L3 Cache, DDR4-2133
CPU Cooler: ACX mITX CPU Cooler
RAM: 16GB (2 x 8GB) HyperX Fury PC4-17000 DDR4 2133MHz CL14
SSD: 500GB 850 EVO SSD, 3D V-NAND, 540/520 MB/s
HDD: 1TB Barracuda®, SATA 6 Gb/s, 7200 RPM, 64MB cache
Case Fans: 2 x 120mm Quiet Case Fan, 1500 RPM, 81.5 CFM, 23 dBA, White LED
Lighting: Custom 20-Color LED Lighting w/ Remote
OS: Windows 10 Pro 64-bit Edition

Continue to ‘Standard Gaming Benchmarks’ >>




Based in the UK, Paul has been immersed in interactive entertainment for the best part of 27 years and has followed advances in gaming with a passionate fervour. His obsession with graphical fidelity over the years has had him branded a ‘graphics whore’ (which he views as the highest compliment) more than once and he holds a particular candle for the dream of the ultimate immersive gaming experience. Having followed and been disappointed by the original VR explosion of the 90s, he then founded RiftVR.com to follow the new and exciting prospect of the rebirth of VR in products like the Oculus Rift. Paul joined forces with Ben to help build the new Road to VR in preparation for what he sees as VR’s coming of age over the next few years.
  • Narabel

    Not sure if I enjoyed your last comment as it implies it wasn’t a cynical decision for competition. It obviously was. Otherwise they would have released the 1080ti a month within the 1080 instead of a year. They’ve done it before like in the 600 series.

    Hope AMD pulls another Ryzen on the GPU side now.

    • JustNiz

      I’ve seen this play out more times than I can count already. It’s inevitably gonna go like this: Vega’s only solid wins against the 1080 will just be on the “performance per $$” graphs. It will generate more heat, noise and use more power than the 1080, and will just slightly beat the 1080 on some benchmarks, but will do less well in real-world gaming. It will have the inevitable batch of new driver issues that AMD always has for at least the first few months, and it won’t be nearly as good as the 1080 for VR because tests/reviews will show it suffers with more micro-stutter, and AMD’s simultaneous re-projection won’t be nearly as well-implemented/supported as nVidia’s.
      Also even though every review website will show how a single Vega GPU won’t come even slightly close to the 1080ti on performance, that still won’t stop the inevitable bunch of brainwashed AMD fanbois making ridiculous claims about how much better Vega is than everything else. And like every other ATI/AMD card since the dawn of time, for Linux users it will suck balls compared to nVidia.
      Later, near the end of the 980ti’s product life, AMD will come out with a whole-house heater/windtunnel that uses 2 Vega GPUs on a single card and an external watercooler and costs $800+ and will only slightly beat a 1080ti on outright performance. But that will cause nVidia to immediately release their new Volta GPU built with TSMC’s new 12nm process, which will also be the first consumer card with an HBM2 stacked memory architecture, which will put an epic smackdown on AMD once more, and round we go again.

      • Nairobi

        That’s a pretty pessimistic comment. Particularly stating that their implementation of Async won’t be as good, when it just came out a few weeks ago for AMD.

      • Javed Asghar

        Fanboy much… 80% of what you wrote is rubbish and far from facts. Ps. Vega RX is not even intending to beat the 1080, it’s targeted at besting even the 1080 Ti, google a little. Some numbers are out. Be happy there is competition. Nvidia loves to F$$$ guys who fork out the cash for Pascal X just to find a few months later that a card at half the price from the same company does even better. Why so? Because AMD really told them to stop ripping off customers and bring it on. Nvidia long time customer here.

        • Michael Carroll

          Haha the only fanboy is you. He nailed that shit to a T, minus one thing: HBM2 stacked was in the Vega and it sucked. 10 months before it happened.

          The RX Vega sucks balls so bad the only people who bought it were fanboys and miners. The 1080 Ti blows them all out of the water, and the Titan Xp beats the 1080 Ti.

          No one in their sound mind would buy an RX Vega. It doesn’t even beat a 1070 Ti, and the 1070 Ti is 450 bucks. RX Vega sucks at everything.

      • David

        So, AMD is the driving force behind NVIDIA’s innovation? I guess in that case, we’re better off if the two are fairly close and continue to push each other to new limits. Just imagine what the world would be without AMD! O_o.

    • psuedonymous

      “Otherwise they would have released the 1080ti a month within the 1080 instead of a year.”

      And large dies just pop out of the fab on a new process with no defects and no high-margin HPC market to address, no? With no competition from AMD, Nvidia could wait as long as they wanted to stockpile binned GP102 dies that did not meet criteria for Teslas, Quadros, or even Titans. If anything, NOT waiting for concrete Vega details just means they’ve gotten bored of waiting and decided that actually putting out a card for people to buy was worth more than holding it back to gazump AMD’s release.

      • Nairobi

        NVIDIA gets a different type of market from AMD. In the cheap GPU department they still rule over Nvidia. AMD has trouble catching up in the high end.

        • Michael Carroll

          There is zero reason to buy an AMD card, from cheap to max. There is a better Nvidia card for cheaper. That’s why Steam charts say over 80 percent of people use Nvidia. Because AMD cards are useless, unless you intentionally want to get worse fps.

  • NoobQuestion

    Which software did you guys use for overclocking?

    • MSI Afterburner. So this was a fairly crude setup (no power based profiles etc.). I’m convinced there’s some more headroom left in the card.

  • Mike

    “It’s frightening the pace at which the GPU industry moves”
    Not really though, if you look at the big picture. Add up all the yearly ~30% performance gains, and you get the equivalent of a new console generation every 5-7 years. A 30% jump seems like a big deal, but it’s not the night-and-day difference between, for example, PS2 and PS3 (something like an 8-to-10-times performance jump – so at least 800%).

      The 980Ti has been superseded by the 1080Ti offering 100% more performance in under 2 years after its release. That’s incredibly rapid, and your console analogy works against you here, as both Sony and Microsoft have recognised the expectant pace I speak of by releasing more powerful consoles within the lifespan of the current generation (PS4 Pro less than 3 years after its predecessor’s release), with Scorpio on the way this year.

      Besides all of that, my statement doesn’t preclude other technology also evolving at an impressive rate.

    • Allen

      ps2 and ps3 are both 30fps machines… there’s very little difference between them in terms of actual ability… neither of them could come close to touching top shelf desktops that were released 3-5 years prior… the new ps is running mid-low range amd stuff from like 2010 if memory serves…

  • Skenzin

    My 1080 is already weak and feeble! I think I’ll sit out a gen, until Gen2 headsets come out. And to think last summer I was happily VRing on an unsupported Geforce 960.

    • Buddydudeguy

      It’s far from weak and feeble. I have a 1080+ a Rift. VR is great. 1.5-2.0 PPD ( super sampling) in all titles, no problem.

      • Globespy

        lol……I have what I consider now an old card (GTX1080 FTW OC’d at 2100Mhz) and an i7 7700K.
        Try your little idea of maxing everything in Project Cars 2 for example. I’m running most items on low or medium to maintain 90FPS with a grid of 25 cars. Even the 1080ti is brought to its knees and has to live with most settings at medium; forget about high, and ultra just isn’t happening. And Pcars2 was heavily optimized from the ground up for VR.
        GPUs are not there yet, at least not for demanding titles.

        BTW – what titles are you claiming to max out every setting and run up to 2.0 SS, so that I can provide you with a laundry list of data (with video backup) that proves you’re talking through your butt, or that you don’t realize ASW has kicked in and you’re only seeing 45FPS.
        I know, I have the same hardware and all the knowledge to squeeze every last bit of performance from VR/GPU – there’s no magic dust my friend. I’m guessing your 2.0 SS is on some title that isn’t demanding… maybe some demo from the Oculus store?
        Sorry to sound like a dick, but I’m tired of seeing kids jumping on and spouting this crap when the very manufacturer of the card and a plethora of respectable hardware reviewers have shown that it can’t do these things.

        I agree with another poster that it will be closer to 5 years before we see full 4K being supported. Unless 2 things happen:
        1. VR HMD manufacturers get smart and have integrated processing power in the HMD to supplement the PC – not everyone can drop $2K on a high end gaming rig.
        2. GPU companies release the damn tech they already have, or are forced to in order to support next gen HMDs that require much more power.

        VR adoption is still somewhat niche, so until a vehicle for mass adoption arrives, it’s just business as usual for Nvidia/AMD with their annual updates.

        • Buddydudeguy

          ” some demo from the Oculus store”? haha…no. That’s why I said 1.5 TO 2.0. Depends on the game, but 1.5 SS is often doable. 1.3 on the most demanding ones. “kids jumping up and down”? I guess 40 is considered a kid. Look, I’m not even sure what you’re saying here, and I don’t really care.

  • xxTheGoDxx

    Very good test. One thing that I find missing though is VRAM usage, to make sure the 1080 isn’t just limited by the bigger framebuffer at higher render scales, because 2017/18 VR titles will most likely still not use more than 8 GB of VRAM at normal render scales.

  • VRdeluxe

    Conclusion is HMD vendors need to seriously pull their thumbs from their asses and release a real VR device with decent resolution. They have all the money, componentry and software to build a true 4k headset but it’s guaranteed they will drip feed us slight improvements to make maximum profit.

    • Tony

      It can be done…the question is can it be done while keeping the cost of the HMD down for consumers?

      • kontis

        What can be done is *NOT* just a matter of money. What is currently available for consumers is pretty close to current technological limits for an HMD that does not cost several million dollars per unit.

        • Tony

          I disagree. What is currently available to consumers in terms of display features on larger panels for monitors is exceptionally better.

          Low latency, high refresh rate, high pixel density displays are available for premium monitors.

          However, implementing these on HMD size panels would make an already small and expensive market, even more expensive and even less attractive.

          Around the DK1 and DK2 era the goal was making VR attractive (affordable) and they were aiming at the $300 USD mark. The $600 USD price for the Rift kicked up a storm. Vive more so, at about $200 more.

          The tech is there, but manufacturers will only include these features if the majority of the VR market will invest in them.

          They won’t develop a 4k HMD for the 5% of a small market who can afford them. The larger the market, the more you can invest (regardless of R&D costs) and the greater the returns. VR is a risk. Make it affordable and (technically) acceptable, and you’ll build a market successfully.

          Edit* Back in the day I modded my DK1 to look sharper and clearer than my DK2 – but at great cost. My DK2 was a better product overall. My CV1 resolution of 2160×1200 at 90Hz split over dual displays is not quite 4k/UHD.

    • usherjerksoffsonyfanboys

      Got to agree with what Tony said….. 4k anything is expensive. An HMD market for one would be extremely small.

      Now are u talking 2k per eye, combined to be 4k… or 4k per eye, because the latter isn’t market feasible at all.

      • VRdeluxe

        2k per eye using 4k screens is all we need for now. Pimax have proved you can eliminate the screen door effect from that alone. Unfortunately their single fixed screen, HDMI connection + amateur software make the overall experience terrible. The Rift CV1 needs nothing more than higher res screens and software to upscale when needed.

        • Konchu

          I would love to see a Pimax headset but it’s hard to swallow without any real support. I know different tech can help with screen door too. For instance I find the screen door on PSVR, a lower res screen, less pronounced, at least to me. I think 2K per eye will be amazing, but I suspect we may need 4k per eye or higher to reach the same clarity as a 4k TV not in VR.

    • D3stroyah

      well i wanted 4k hmd two years ago, i’m sitting back waiting. 1080ti could move that with ease

      • VRdeluxe

        Hopefully. The head of Nvidia confirmed it will do 4K VR at the launch event. I expected the same after all the hype but the CV1 turned out to be a fizzer. Mine broke after just 6 months.

      • Lance

        You’re dreaming… the 1080ti will not run 4k VR very well… maybe a small hallway or room… but an open area with trees… not a chance.

    • kontis

      No one has their thumbs in their asses. Current headsets are pretty close to the cutting-edge tech and what’s possible today for ANY money.

      There was only one *prototype sample* of an OLED screen with higher PPI than in current headsets shown by Samsung last year and it is still NOT being used even by Samsung and even in the flagship, expensive phones that sell in large quantities. Samsung claimed that *research(!) and development* of a very high resolution screen for VR headsets would require 5-10 billion dollars. This industry cannot afford that yet and the technology you want does not exist yet.

      You want alien technology, so maybe ask aliens.

      • OgreTactics

        Ah the usual “I’m an unperceptive loser who never produced anything neither do I know the matrices of cost management and actual tech and price available…therefor it must be alien technology”. I fucking hate hypocrites like you.

        • joemitz

          Jesus, calm your tits

          • rheddherring

            Won’t happen Joe. The pseudo intellectuals have to condescend, that’s what they do. They live in a fantasy geek world where they are Galactor of the Universe. But let them spin, not our monkeys, but it is quite a circus.

        • VRdeluxe

          Yes this guy is the generic fool on these forums who believes the only technology that exists today is something they read about in the news from one company nearly a year ago. There are multiple companies working on advanced micro display technology which we the consumer know nothing about. Both Oculus and HTC will already have advanced prototypes ready to go. The Crescent Bay CV1 was designed more than 2 years ago, when Oculus was worth nothing.

      • VRdeluxe

      You are the classic poster who believes the only tech available is what’s currently on the shelves or something you have read in the news. Oculus and HTC already have an advanced version of the CV1. It’s all just about market strategy and making top dollar.

        • Stacey Bright

          The problem isn’t the availability of tech, but the cost associated. It’s bad enough these things already cost $600, but getting anywhere near 4K 90fps means spending $500-$700 on a graphics card as well. One of two things needs to happen before it’s really feasible to mass produce higher res HMDs. Either 4K-capable GPUs need to be sub-$400, or VR mGPU support has to become standard and universal. We are probably 2 GPU generations off for the former, and the latter would likely be made easier by transitioning to the newer APIs.

    • OgreTactics

      But how much resolution is enough resolution? A well chosen 4K AMOLED certainly rids HMDs of either the screen-door or led-curtain effect, but then…? I don’t think resolution is the challenge right now, rather it’s the underwhelmingly low FOV that’s a problem to be solved.

    • Claus Sølvsten

      If I was given the option between the current Vive resolution and full 4k I would go full 4k, even if it would cost me a little more for a better panel. Steam already supersamples to 1.4 times the Vive resolution as far as I can read.

      • VRdeluxe

      Exactly. I think it’s unfortunate we have to supersample when we could be running native resolution. Games look barely ok with everything running at maximum, and VR video looks awful in low resolution. That’s where most of the investment in VR is going, so it makes sense for them to release the CV2 as soon as possible.

    • Jerald Doerr

      Lol… VRdeluxe… so go out and make one for us! If it don’t cost too much I’ll take one…

    • Jason Lovegren

      I figured I would chime in for the first time. Everyone is saying make the HMD 4k, which makes no sense at this point. The software outweighs the hardware. VR is very demanding, and 4K would cause people to simply puke. They’re trying to make VR mainstream and 4k is just too demanding at this point. Hell, standard 4k gaming still isn’t here even with this card. Granted, a lot of games, but not all of them.

    • Lance

      And what video card will push 4k in VR? None on the market….cards and VR are going hand in hand.. 5 years from now VR in 4k will be common…

  • D3stroyah

    Holy shie, 3x and it still works without dropping off ASW (never under 45fps). This will make 4k HMDs super easy.

  • OgreTactics

    I didn’t find the 1080ti to significantly change anything, from real-time processing or rendering to VR/interactive/simulation performance.

    In fact I’m both impressed and disappointed that Nvidia still is not following Intel’s miniaturisation process and is still able to juice out more “Terraflopz”, because the size, waste, consumption and heat of these fat bulky GPUs are what is holding back the whole computing industry, in which an over-powered (i7k, 16gb DDR4, PCIe, nvme, nano-ITX, wifiad chip) computer fits in a tiny box almost the size of a single fat GTX.

    • Allen

      This “gotta make it smaller and use less power” shit is what’s really hampering the performance market.. idc if the next gpu is 100ft x 100ft and uses 10kW, performance is what I want.

  • philb

    Sony makes a crazy expensive 0.7-inch (18.0mm diagonal) OLED panel (1280 x 720). The thing is a 2098PPI OLED display. No, you’re not going to find it in an HTC Vive panel size, but it goes to show you crazy DPI is possible; it comes down to cost. It’s from the viewfinder of a 40,000 dollar camera or something along those lines. I am sure the viewfinder is likely 5 grand, so the panel is likely 2 grand. I am sure as technology improves the prices will come down for these panels.

    http://www.androidauthority.com/worlds-highest-ppi-display-2098ppi-oled-display-sony-254182/

  • Laugh Out Loud!

    nvidia sucks

  • Laugh Out Loud!

    like/literally…

  • Laugh Out Loud!

    i made my own graphics card and its 2 time faster than the nvidia titan x

  • Laugh Out Loud!

    i mean nvidia is goood but not in all types of graphical preformances