While Microsoft’s ‘Mixed Reality’ VR headsets are due to launch with some impressively low minimum hardware requirements that will be met by integrated graphics, the big players presently in the market have a much higher ‘VR Ready’ bar which involves a beefy GPU for high-end gaming applications. Intel says they expect their integrated CPU graphics will eventually achieve the performance required to meet that high-end bar, though even their new ultra high-end Core i9 isn’t there yet.

‘VR Ready’ is a colloquial term that refers primarily to the recommended hardware specifications for VR, supplied by both Oculus and HTC. While their recommendations cover a range of facets, arguably the most important part is the need for a powerful GPU, a big (and expensive) graphics processing component.

Both companies point to the same baseline GPUs for their recommended specifications: an AMD Radeon R9 290 or Nvidia GTX 970 (or better), mid-range cards that can handle the high-powered rendering tasks required of VR headsets.
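
To make the idea of ‘meeting the recommended spec’ more concrete, here is a minimal sketch (in Python, purely for illustration) of how a reported GPU model string could be compared against the baseline cards named above. The list of “or higher” models and the simple substring matching are assumptions of this example, not an official Oculus or HTC check.

```python
# Hypothetical sketch: is a reported GPU at or above the 'VR Ready' recommended baseline?
# The model list and the matching logic are illustrative assumptions only.

VR_READY_GPUS = [
    "R9 290",    # AMD baseline named by Oculus and HTC
    "GTX 970",   # Nvidia baseline named by Oculus and HTC
    "GTX 980",   # examples of cards above the baseline (assumed for illustration)
    "GTX 1070",
    "GTX 1080",
    "RX 480",
]

def meets_recommended_spec(gpu_name: str) -> bool:
    """Return True if the reported GPU name matches a baseline (or better) card."""
    name = gpu_name.upper()
    return any(model in name for model in VR_READY_GPUS)

if __name__ == "__main__":
    for gpu in ("NVIDIA GeForce GTX 970", "Intel HD Graphics 630"):
        verdict = "meets recommended spec" if meets_recommended_spec(gpu) else "below recommended spec"
        print(f"{gpu}: {verdict}")
```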

SEE ALSO
Intel's Next Project Alloy Prototype to Get Enhanced CV Capabilities, Wider FoV, More Powerful Processor

In the early days of consumer computing, a PC had only a CPU (the computer’s main logic processor), but as 3D graphics came to the fore, the GPU became a common addition to the computer’s hardware, acting as a specialized processor for the calculations needed to render 3D scenes.

For people not into gaming or 3D workflows, however, the GPU isn’t entirely necessary as CPUs can do enough graphics processing to get by. If you hear anyone talking about ‘integrated’ graphics, this is what they mean: using the CPU’s own built-in graphics processing capabilities rather than a separate (AKA ‘discrete’) GPU.
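
For readers curious which kind of graphics their own machine is using, here is a rough sketch, assuming a Windows machine where the `wmic` command-line tool is available, that lists the video adapters the OS reports and guesses whether each is Intel integrated graphics or a discrete GPU; the vendor-keyword heuristic is an assumption for illustration only, not a definitive test.

```python
# Rough sketch (assuming Windows with the `wmic` tool available): list the video
# adapters reported by WMI and guess whether each is integrated or discrete.
# The "Intel means integrated" heuristic is an illustrative assumption.
import subprocess

def list_video_adapters():
    """Return the adapter names reported by WMI's Win32_VideoController class."""
    output = subprocess.run(
        ["wmic", "path", "win32_VideoController", "get", "name"],
        capture_output=True, text=True, check=True,
    ).stdout
    lines = [line.strip() for line in output.splitlines()]
    return [line for line in lines if line and line.lower() != "name"]

if __name__ == "__main__":
    for adapter in list_video_adapters():
        kind = "integrated (CPU) graphics" if "Intel" in adapter else "likely a discrete GPU"
        print(f"{kind}: {adapter}")
```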

Over the last 15 years, however, the GPU has become an increasingly important part of the modern computer, driven by the growing demand for graphical processing from 3D gaming and other graphics-heavy tasks that can be accelerated with a GPU.

That’s lit a fire under CPU-maker Intel, who decided in the last 10 years or so that their integrated CPU graphics needed to step up in order to keep pace with a world demanding ever more graphics processing power.

A look at the layout of Intel’s 7th-generation Core i7 die; a significant part of the footprint is the Intel HD Graphics core. | Photo courtesy Intel

So the company introduced ‘Intel HD Graphics’, adding more graphics processing performance to its CPUs. Since the introduction of HD Graphics, Intel’s integrated graphics have gone (as far as gaming is concerned) from pretty much a joke to being powerful enough for modern low-end gaming (in fact, about 16% of Steam users at present run on Intel’s integrated graphics). For consumers who only want to do light gaming and other light graphical work, that’s a big plus because it means they can eschew the cost and bulk of a dedicated GPU.

And while Intel has actually made pretty impressive strides in their integrated CPU graphics, the current level of performance—even in the new Intel Core i9, an ultra high-end CPU priced at $2,000—is still far from ‘VR Ready’.

Intel is already branding its processors around VR, but they still rely on a mid-range GPU for the graphics needed to meet the VR Ready spec. | Photo courtesy Intel

But Intel plans to continue growing the graphics processing capabilities of their CPUs, and expects one day to meet the same VR Ready spec that Oculus, HTC, and others recommend for high-end VR gaming.

Frank Soqui, Intel’s General Manager of Virtual Reality and Gaming, confirmed as much speaking with Road to VR this week.

A look into the heart of Intel’s latest Core i9 X-series processor | Photo courtesy Intel

“Integrated graphics will support high-end VR… and not just mainstream [like the Windows Mixed Reality spec]. Our goal is to continue to push performance on CPU and graphics [to achieve] the same high-end content you’d see on Oculus and HTC.”

It’s a bold claim, if nothing else because of how little die area a CPU can devote to graphics compared to a dedicated GPU, but it reinforces Intel’s commitment to not just keep up with the dedicated GPU, but to close the graphics performance gap between integrated graphics and dedicated GPUs.

SEE ALSO
Analysis: An Estimated 58 Million Steam Users Now Have VR Ready Graphics Cards

Soqui didn’t offer any sort of timeline for when this might happen, and the way he talked about it made me feel like it wouldn’t be in the near future. But if Intel succeeds in this venture it would surely be a major boon for the VR industry because the majority of modern computers in the world lack dedicated GPUs and instead rely on integrated graphics. Integrated graphics capable of high-end VR would mean a much larger addressable market for VR hardware and a cheaper initial buy-in too.




Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • Zobeid

    Hmm… But if you’re going to have a graphics card in your system, then isn’t the integrated graphics just a waste of die space? Can you get CPUs without the graphics? (Would that be the Xeon series??)

    • I guess you could run even more monitors by using the iGPU + dedicated GPU. As for which have an iGPU, even Xeons can, as do most desktop CPUs, e.g. http://ark.intel.com/products/family/88210/Intel-Xeon-Processor-E3-v5-Family#@Server

    • Get Schwifty!

      It is if you use a dedicated card, which you certainly do for VR. That being said, many laptops (outside of gaming or high-end models) use just the integrated graphics alone, so it *does* have a place. I believe you can get some CPU lines like the Xeon without it, but it has one serious advantage for laptops, which is that integrated graphics use far, far less power, so even in that application it makes good sense.

      • Darshan

        On-die GPUs are included as a bonus: if your graphics card ever dies, your system doesn’t lose its core basic functions while you wait for a new one, and your PC keeps working for mail, surfing, and other business. (They also let users who bought a system for non-gaming purposes, but who still want normal video capabilities like playing movies or doing light photo editing in CorelDRAW or Photoshop, get by without buying a discrete GPU.) So it’s good to have.

        Previously, motherboard manufacturers used to put this general-purpose video chip on the motherboard as part of the northbridge, the part of the motherboard that handles communication among the CPU, in some cases the RAM, and PCI Express (or AGP) video cards, with the southbridge handling I/O.

        Some older northbridges also contained integrated video controllers, known as a Graphics and Memory Controller Hub (GMCH) in Intel systems. Because different processors and RAM require different signaling, a given northbridge will typically work with only one or two classes of CPU and generally only one type of RAM. This led to the evolution of SoC (system-on-chip) CPUs, where multiple CPU models of a single generation, sharing an identical pin layout, can be fitted to a single motherboard (so we get motherboards that can support an i3, i5, or i7 without changes, instead of needing a new motherboard for each processor), which brings us to today’s processors.

        There were experiments in the past to combine the processing of the on-die GPU and the graphics card to increase performance, but it never worked out: the gains were minimal against the added complexity.

        So if you don’t want to use the on-die GPU, just ignore its existence. It’s certainly not a waste of die space when you consider the complete user base.

  • Raphael

    Good idea to boost integrated GPU performance to the point where it’s usable. The PC platform is too expensive for many gamers, especially those who want to enter VR. On the downside… Intel doesn’t price their processors to make the platform more affordable. How much does a PS4 cost? Well, an Intel processor can cost more than a console.

    • Lucidfeuer

      doesn’t make sense.

  • Serg

    Intel are a bunch of greedy capitalists. At this price point I’d gladly consider Threadripper.

    • Get Schwifty!

      Heh I think being capitalist pretty much automatically implies a degree of greed… no need for the extra delineation :)

    • Lucidfeuer

      Or an AMD Ryzen. Also, it has nothing to do with capitalism but with extreme financial liberalism, which we live under now and which is suffocating the whole world, countries, companies and people, until they’ve stolen all the monetary value from the people who actually create it and, after the economic crisis, there’s finally a crash. But that’s another topic.

    • evo_9

      This isn’t anything new. Everyone nails the early adopters; this is how they cover their R&D. Do you think fabs are cheap? In 6 months they’ll be half the price, a year from now half that. Or whatever Moore’s Law predicts… etc.

      • Lucidfeuer

        Moore’s law is pretty dead. We’re force-pushing toward sub-10nm chips, which points toward sub-nanometer dies… at which point we’ll probably start transitioning toward quantum computers… which means CPU innovation via die shrinks is pretty much dying.

        • Dominic Lacroix

          So much wrong with your comment. Fab processes will stop at 3-5nm because further shrinks will not be economically viable. After that, stacking (3D) is the next step. As for quantum computing, it cannot do what classical computing does because the goal is not the same. Read up on quantum computing before talking about it. They will never overlap.

          • Lucidfeuer

            Yeah right, die shrink has been stuck at 14nm for 4 years now, and it’ll barely switch to 12/10nm next year, while the whole marketing calendar has slid into smaller increments based on secondary features and optimisation, which basically means we might be stuck with 10nm for even longer, and there is no current plan to go smaller for consumers.

            As for quantum computers, what the fuck do you think they’re for? There are firms investing in them for finance, science, AI, simulation, and 3D applications right now. Or we’ll stay with the same CPU/GPU system for another century… yeah right.

          • Dominic Lacroix

            Here’s an article about quantum computing and why it cannot replace traditional computing:
            https://arstechnica.com/science/2010/01/a-tale-of-two-qubits-how-quantum-computers-work/

            and here’s an article on IBM’s new proof of concept 5nm chips:
            https://arstechnica.com/gadgets/2017/06/ibm-5nm-chip/

            If you still want to argue then I’m out since facts have no effect on you (I’ve seen you argue with many people here based solely on your opinion)

          • Lucidfeuer

            You’re just being inexact, even per your own sources. IBM isn’t the first to have made a sub-10nm die-size chip, but how long do you think it’ll take to reach the consumer market? According to Moore’s law it should already be here, given that the 14nm chips are 4 years old… this law is dead, and it’ll take a lot of time.

            As for quantum computers, they WILL of course be a consumer technology in 15-20 years. It’s a matter of isolating the chip operations from the user operations so as not to disturb the system, and for example being able to have multi-qubit operation encryption so that some of it is wasted in the case of instabilities but the information is still conserved and operated on. There’s even the option of a strictly cloud-based quantum computing network, which means its operation cannot be altered by user operations.

  • vvecchi

    The i9 doesn’t have integrated graphics, the title makes no sense.

    • evo_9

      And thank God it doesn’t! Also I was wondering when I’d have a valid option to upgrade my i7 boxes. Of course that 2k price isn’t going to work but I’m a patient guy…

      • vvecchi

        there is an i7 and even an i5 in the new X299 platform, and none of them have integrated graphics, although I would go with an i7 on the Z270 platform and ignore the integrated graphics

    • Tadd Seiff

      “even in the new Intel Core i9, an ultra high-end CPU priced at $2,000—is still far from ‘VR Ready’.” This would seem to suggest the author honestly believes that it does have integrated graphics.

      I’m being seriously misled by someone here…

      And if they don’t, as you claim, have integrated graphics, then indeed, the i9 “is still far from VR Ready”, much much farther than they realize.

    • benz145

      This is partly the point, though the article could have made it more clear. Integrated graphics aren’t good enough for high-end VR content; by not having any integrated graphics Intel is offering up the expectation that the CPU will rely entirely on an external GPU, hence the i9 “Isn’t ‘VR Ready’ on Its Own” (because it needs to be paired with a VR Ready GPU).

      Sorry for the confusion, I was trying to straddle an important line: I avoided saying flatly that the i9 "is not VR Ready" because technically it is a VR Ready CPU, given that it meets the recommended requirements for CPU power.

  • Foreign Devil

    I’d much rather they don’t add cost and precious room to the CPU for dedicated graphics, which will become redundant in 90% of cases when people buy a dedicated GPU. Now I would be interested in seeing game engines able to scale to the multiple cores now being offered (that $2K Intel i9 has 18 cores available). It certainly can have a place in laptops and mobile… but NOT in high-end desktops (which, more and more, are the only kind of desktops people own).

  • Darshan

    “but to close the graphics performance gap between integrated graphics and dedicated GPUs.” That’s not likely to happen… the reason is more related to money than technology, as it would certainly dent the business of Nvidia and AMD Radeon. I’m much more interested to see a VR Ready $200 graphics card…

    Still, such advances may enable netbooks to support games like EVE: Valkyrie… but it certainly won’t be cheaper if the SoC CPU alone is going to cost $2,000. How such an SoC CPU would handle the thermal requirements in a netbook housing, and how long a netbook battery would last under that usage, is another elephant-in-the-room question.

  • Peter Hansen

    Oculus specs went down a bit lately:

    https://support.oculus.com/help/oculus/170128916778795

    • benz145

      A good point; this article carefully talks about “recommended specifications” from Oculus and HTC (which are essentially identical), and not “minimum specifications” (for which Oculus has a separate designation).