Exclusive: Former Oculus VP of Engineering Demonstrates Long Range VR Tracking System


Tracking is the foundation of great virtual reality. Knowing the position and orientation of the user’s head is essential to being able to move the virtual world convincingly around them as they move through it. Oculus and Valve/HTC have the leading tracking systems (dubbed ‘Constellation’ and ‘Lighthouse’, respectively), but soon a new entrant could join the VR tracking arena.

Into the Workshop

It’s an unusually rainy day in the hills of Livermore, CA, as I stroll down a row of small connected office buildings. Door after door with company logos prominently featured in the windows, but it isn’t until I find the door identified with only a small oval sticker that I stop. Collectors hoarding VR memorabilia 50 years from now might recognize the sticker—adorned with the letters ‘VR’ in the center and tiny ‘Oculus’ text at the bottom—as one given out by the company in its earliest days. Fittingly, behind the door is the personal workshop of one of the company’s earliest employees.

Jack McCauley served as Oculus’ VP of Engineering up through the Rift DK2. He was among four employees featured in the company’s 2012 Kickstarter video (not counting John Carmack and Michael Abrash, who would both join years later). Much of the footage from the Kickstarter video was actually shot in McCauley’s workshop where I am now standing.

By some definitions (including his) McCauley is a co-founder of Oculus due to his contributions and status as one of the earliest hires, though as far as Oculus itself is concerned, the company represents Palmer Luckey as the one and only “Founder” (with a capital F). As a seasoned design engineer who played an instrumental role in the creation of the Guitar Hero peripherals, McCauley brought hardware design experience to the young company, which was about to embark on designing, manufacturing, and delivering thousands of Rift DK1 headsets to eager backers of the crowdfunding campaign.



McCauley takes me through a small lobby space, which is adorned with Guitar Hero and Oculus placards of recognition, into what he calls ‘McCauley Labs.’ “Lab,” I suppose, because what goes on inside is not just crafting but plenty of experimentation, and “McCauley Labs” (as a business name) perhaps because he employs several staff members with complementary expertise who explore projects he doesn’t quite have time for.

The workshop is a veritable wonderland for the hacker/maker type, littered with industrial-grade equipment that McCauley uses for work and hobbies alike. As I make my way to the back of the space, I see that the workshop’s walls are covered with graffiti-style art with characters and logos from video games, including a prominent depiction of a Lola T70 sports car.

McCauley positions himself at the edge of a large open space near a garage door in the back of the workshop and asks me to step aside. As I clear the area, a car elevator—the kind you would find in a parking garage—begins to lower from overhead. McCauley had the lift installed so that he could store his vehicles on the second floor of his office. As the lift reaches the ground, I see the project that’s currently occupying much of his time: a half-finished Lola T70, the same one from the wall. McCauley is building his own, mostly from the ground up, and he plans to race it when he’s done. He points out that the chassis is the same geometry as the original, but this one is TIG welded instead of riveted.

It’s clear at this point that McCauley is very hardware oriented, but a tinge of tech begins to shine through as he explains his plan to build the steering wheel to accommodate an Android tablet which will talk to the car’s engine via Bluetooth, displaying all its vitals in a central location. When I ask if he knows yet which app he’ll use for that purpose, he tells me he plans to write his own.


What I Came to See

At this point McCauley returns the lift to the second floor and we head back to the middle of the workshop. Amidst huge tool boxes and complex machinery is a small 3D printed white enclosure attached to a wheeled worktray by a GoPro mount. This is what I came to see.


The face of the enclosure is about the surface area of a square box of tissues, but only about a quarter as deep. The back is seemingly missing a rear plate, which gives a clear view of an array of circuit boards and wires. Like Valve’s Lighthouse system, this is a laser-based tracking system. But unlike Lighthouse—which sweeps the room with lasers regardless of what’s in the area—this one actively seeks its intended target.

The MEMS Tracking System (or MTS, which I’m going to call it for ease of use) shoots its laser in a unique way compared to Lighthouse. Lighthouse uses lenses to stretch the laser’s landing area from a point into a line, then sweeps those lines around a space by mounting the lenses on precisely spinning motors. MTS, on the other hand, uses a tiny mirror (which tilts but doesn’t spin) to point the laser in any single direction.


When initialized, MTS scans a cone-shaped area in front of it in a grid-like pattern to search for its target: a point of high reflectivity, measured by the amount of laser light returning to the origin. Of course, the laser could stop on any slightly reflective surface, so the target must be much more reflective than anything else in the environment, and a minimum detection floor is set so that the laser keeps scanning until a sufficiently bright reflection is found. A retroreflective marker, placed on a headset or motion controller, serves as that sufficiently bright object, ensuring that much of the laser’s energy is reflected straight back to the receiver.
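The acquisition phase described above can be sketched in a few lines of code. To be clear, this is a hypothetical illustration rather than McCauley's implementation: `read_intensity` stands in for aiming the MEMS mirror and sampling the returned laser light, and the angular ranges, step size, and detection floor are all made-up values.

```python
def acquire_marker(read_intensity, pan_range=(-20.0, 20.0),
                   tilt_range=(-15.0, 15.0), step=0.5, floor=0.8):
    """Raster-scan the mirror's cone and return the first (pan, tilt)
    angle pair whose reflected intensity exceeds `floor`, or None.

    `read_intensity(pan, tilt)` is a stand-in for pointing the mirror
    and sampling the sensor; it returns a normalized reflectivity [0, 1].
    """
    pan_lo, pan_hi = pan_range
    tilt_lo, tilt_hi = tilt_range
    tilt = tilt_hi
    while tilt >= tilt_lo:                 # top to bottom, line by line
        pan = pan_lo
        while pan <= pan_hi:               # left-to-right sweep
            if read_intensity(pan, tilt) >= floor:
                return pan, tilt           # lock onto the marker
            pan += step
        tilt -= step
    return None                            # no sufficiently bright target
```

A real system would follow this with the fine tracking loop described below, and would scan far faster than the demo's human-friendly pace.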

The MTS laser fixated on a static marker attached to a Gear VR headset.

For a proof of concept, MTS is impressive. When McCauley first demonstrated the system, a small spherical marker attached to a Gear VR headset was sitting about 5 feet away. He turned on the MTS basestation and I watched as the laser slowly indexed the scene looking for its target. It started, quite logically, at the top left, and ran horizontally until reaching its right-most limit, then returned to the left, dropped down slightly, and continued from there, line-by-line. When it reached the marker on the headset, it stopped. The initial scan took 4 or 5 seconds. So at this point I figured MTS was a neat concept, but there was still much work to be done to reduce the 5-second cycle down to mere milliseconds so that the tracking would be fast enough for practical use.

It wasn’t until McCauley picked up the headset and started moving it around, with the laser continuously fixed on the marker, that I realized the initial scan was probably slowed down for human benefit, and that MTS was already capable of high speed tracking.

Once the marker is found, the mirror continuously aims the laser at the object as it moves, constantly seeking the point of most intense reflectivity. Assuming this can be achieved robustly, tracking the object’s position on an XY coordinate plane is as easy as reading the angles of the mirror that’s aiming the laser. With one additional marker on the tracked object, or an additional MTS basestation, you now have all the angles needed to triangulate the object’s XYZ position.
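To make the triangulation step concrete, here is a minimal sketch (my own illustration, not MTS code) of recovering an XYZ position from the mirror angles of two basestations at known positions. Each station's pan/tilt pair defines a ray toward the marker; with perfect angles the rays intersect at the marker, so the midpoint of their closest approach serves as the estimate. The coordinate convention (+z forward, +y up) is an assumption.

```python
import math

def ray_direction(pan_deg, tilt_deg):
    """Unit direction vector for a mirror aimed at (pan, tilt) degrees,
    using +z forward, +x right, +y up."""
    p, t = math.radians(pan_deg), math.radians(tilt_deg)
    return (math.cos(t) * math.sin(p), math.sin(t), math.cos(t) * math.cos(p))

def triangulate(origin_a, angles_a, origin_b, angles_b):
    """Estimate the marker's XYZ as the midpoint of the closest approach
    between the two basestations' rays."""
    da, db = ray_direction(*angles_a), ray_direction(*angles_b)
    w = [origin_a[i] - origin_b[i] for i in range(3)]
    dot = lambda u, v: sum(u[i] * v[i] for i in range(3))
    # Solve for s, t minimizing |(origin_a + s*da) - (origin_b + t*db)|.
    a, b, c = dot(da, da), dot(da, db), dot(db, db)
    d, e = dot(da, w), dot(db, w)
    denom = a * c - b * b          # assumes the rays are not parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    pa = [origin_a[i] + s * da[i] for i in range(3)]
    pb = [origin_b[i] + t * db[i] for i in range(3)]
    return tuple((pa[i] + pb[i]) / 2 for i in range(3))
```

In practice the angles would be noisy, so the rays rarely intersect exactly; the closest-approach midpoint degrades gracefully, and a real tracker would also fuse this estimate with IMU data.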

Continue Reading on Page 2…



Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • psuedonymous

    This exact same technique was first implemented over a decade ago: http://www.k2.t.u-tokyo.ac.jp/fusion/LaserActiveTracking/

    • mellott124

      Wow, really impressed by this. There are several papers on it at the end of the linked page as well.

    • benz145

      From the article:

      “McCauley is clear to point out that much of what comprises MTS are off the shelf parts and algorithms pioneered for other purposes.”

      “Why nobody [applied this tech] for VR/MOCAP, I do not know. Perhaps nobody thought of it recently but it’s 16 years old and very established,” [McCauley said]

    • Rob B

      Not sure if this is the same group, but just found this with MEMS mirrors for 3D tracking in 2009:


  • Kevin White

    The obvious fix is to have the cameras inside the headset.

    • Hamish Pain

      Except then whatever the cameras on the headset are looking at will get smaller the further away you get, right? And then you have to get the data back to the computer through the headset cables

    • benz145

      We can put cameras on headsets. That’s not the thing preventing VR-capable inside-out tracking.

  • Bryan Ischo

    I fail to see why this is better than lighthouse. The lighthouse units themselves are probably just as cheap to build as this unit, since they have little more than synchronization logic internally and the sweeping laser. The sensors on the Vive headset add cost but they’re just small diodes as far as I am aware and probably not that expensive. Also they can be added to additional devices and track at the same time. How can this single laser track both my head and my hands? Can’t do it.

    I like that people are looking for better solutions but … just make lighthouse cheaper via volume production and refinement. There’s your tracking solution, now go focus on something else. Like, finding a way to eliminate the display cable. Or improving the field of view. Or improving the resolution. Those things are desperately needed. Better tracking than lighthouse, not so much.

    • Tyler Cook

      If this works like the Sony laser pico projector, then it could track multiple objects. The way it works is like a CRT screen: it paints an image by sweeping a laser in a rectangle pattern using the MEMS mirror. If there were two points to track, it would get both as it swept.

      It is very similar to lighthouse, except it is sweeping in a more precisely controlled area.

      But, I do agree with you in that it doesn’t really need to be that precise. Lighthouse floods the area, which is exactly what you want anyway, since you can be anywhere inside the zone.

      The only big benefit I would see is if, for some reason, sweeping a laser in this fashion is somehow more efficient or allows for greater distance in tracking. Then, you can have bigger rooms.

      • Bryan Ischo

        Good points; however, you can have bigger rooms with more lighthouse units and some synchronization logic and/or software to stitch the swept areas together.

        Also, I’m pretty sure that you could track 500 sensors using lighthouse if you had enough sensors. You can just keep adding sensors indefinitely, they can all share the same tracking signal from the lighthouse units. I doubt that the solution described in this article could go beyond a few tracked points, even if it used the sony laser pico projector technique you described.

        Also there’s the fact that the technique described in this article requires a reflection of laser light back to the unit; there is room for lots of error there.

        • Jack McCauley

          I have two systems, a vector based one like the film, and a raster scanned one like the pico projector. Either one will work about as well as the other.

          • Bryan Ischo

            Thank you for replying. My goal is in no way to discourage your work, if you can make better/cheaper tracking, by all means do!

          • Eric B

            Hi Jack, are you still working on this tracking? I think it’s fantastic and it would lead to mass adoption of virtual reality.

      • Jack McCauley

        Exactly. I’ve got one of those to experiment with, an oscillating mirror.

      • Sven Viking

        As presented here, though, each complete sweep is taking seconds rather than milliseconds.

    • David Mulder

      The big advantage seems to me to be that the cost-per-tracked-item will be FAR lower. So if you would want to do full body tracking then this is a solution that might do the trick if it’s fast enough, whilst Lighthouse is simply impossible and camera-based tracking unlikely. The disadvantage of this system is that it has – like Oculus’ camera tracking – a far bigger occlusion problem, so you might end up having to put a fair number of these stations in your room for reliable tracking of more than a headset.

    • Jack McCauley
      • Rob B

        Have you considered placing one mirror on the tracked object (headset), and searching for a fixed lit up marker? That way multiple objects may simultaneously track themselves without interference. (ie, they all try to point to the same bright fixed object).

    • Jack McCauley

      A couple of things, the laser energy is spread over an ellipse and with mine it’s a point source so the range is much longer. What do you think four motors cost??

      • Ryan

        I’m actually very surprised Vive came in as cheap as it did, given the cost of motors and lasers. Micro fabricated mirrors should be cheaper, but do they have the tip/tilt range to track objects close to the mirror?

        • Jack H

          I haven’t seen MEMS raster mirrors with FoV better than about 14 deg.

      • Hamish Pain

        @BryanIscho:disqus Advantages as far as I can see them:
        Longer range (Lighthouse needs an LED flash), automatic visual acknowledgement of tracking (a blessing for devs like me, though that will probably change to an IR laser, right?), far higher update frequency (lighthouse needs three phases of light emission, so there can be some drift in the meantime+need for IMU).
        Harder to track multiple regions (maybe? If the mirror’s steerability has a great response time and low overshoot [MEMS!], then it could probably be adapted to continuous scanning + tracking multiple items at reduced update frequency), still needs to be hooked up to a computer or the tracked object needs a light-sensor for information transmission, reliability of MEMS steering vs. motors (MEMS may be better due to low-mass and no friction issues), doesn’t track orientation currently or depth (though I can think of a few ways to do so with what I’m assuming to be the actual tech behind it. FPGA, yeah?)

        @jack_mccauley:disqus Love this kind of high-speed tracking. Have you considered using retroreflective materials for the marker? That could increase range even further by reducing laser-spread on impact. It could also allow a second laser-tracker unit to target each marker by reducing light-interference without having to alternately-pulse the lasers. I’m assuming the visual sensor, if not a camera, is an opto-diode under a lens for this, as spatially disparate laser light would still interfere with it.

        Pretty awesome laser steering, MEMS has really sped things up. I remember making a spinning mirror based laser tracer, what I wouldn’t have given for a steerable mirror!

        • benz145

          To answer a small part of this, some of the tracked objects in the video were retroreflective markers. Jack showed me a few different things being tracked.

    • Andrew Jakobs

      One reason this is better: it has no actual physical moving parts like the motors for spinning the lens on the lighthouse (which make noise and will wear out after a while; at this point we have no idea how long the lighthouse basestations will work)..
      And the mirror is just like a DLP chip, so multiple mirrors can track multiple objects, BUT you’re right, in that regard the lighthouse system is much simpler as you just add sensors on the stuff you want tracked and don’t need extra lasers for that..

    • Chip Weinberger

      Another limitation to lighthouse is the size of the objects it can track.

      Look at the size of the ‘halo’ on the vive controllers. That’s about the limit for good tracking.

      This could possibly track smaller items. It is also cheaper to make a tracked object: you could put these reflective markers on lots of objects, without the need for electronics in them. Put a marker on whatever you want and have high fidelity tracking on it.

  • VR Geek

    Sounds like there may have been a few egos battling over which way to do the tracking at Oculus. Not sure Oculus went the right way with their camera based tracking. We will see in the final product, but it will need to be WAY better than the DK2, which would require IMO something much greater than the 752×480 camera they shipped with it. Based on this article and my own personal experience with the DK2, I suspect even 1920×1080 would not give enough resolution to track when back 10 feet. Maybe 4K plus would, but then there are larger bandwidth demands and, more painfully, massive computational efforts required by the host PC. My gut is telling me that once the dust settles in May or June, how big of an issue (or not) the Constellation tracking system is will become very apparent. I sense Oculus is in trouble here, especially after using the Vive. I hope not, as they really have done sooooo much to get VR off the ground.

    • DJ

      Sometime after the DK2 was released, Palmer Luckey explained that they went with the camera tracking system (now known as Constellation) after they’d tried everything that was viable at the time. Apparently they didn’t try just a few, they tried dozens of technologies. Many tracking systems are better than Constellation, but they’re either too expensive, or locked by patents and unattainable, or have problems that only become apparent in application. The one technology that matched all of their criteria for price, availability, and capability better than any other they tried was Constellation. It was a basic engineering decision.

      It’s easy to watch all of these impressive tech demos and say, “That’s the solution to everything!” But they rarely actually are solutions to your very specific application.

      I think that Constellation probably won’t survive more than a couple iterations of the Oculus Rift. Better systems are coming down the pipeline, you’d be ignorant to think that Oculus isn’t working with them to determine their viability. And new ideas are being invented that might even supplant those in a few years more. It’s a very volatile, and exciting, area of research at the moment.

      • VR Geek

        I very much agree with all your comments. That said, Valve is about to own Oculus pretty hard with Lighthouse. You cannot blame Oculus as they surely tried their hardest. It is just interesting that even with tons of money and top talent, they are coming to market with what looks like the inferior tracking system. I am sure someone over there is losing sleep. I hope Constellation is better than previous demos when the CV1 arrives, and that they quickly address it if not for CV2.

        • Rob B

          From my understanding, the resolution of the camera isn’t as limiting as you describe. It’s still used in conjunction with the IMU, and only used to correct larger scale drift.

          • VR Geek

            I can only speak to the DK2 camera myself which also uses the IMU, but it has never been that solid. We will have to wait to try the CV1. Lighthouse was incredible when I tried it extensively last year. Super solid.

          • Rob B

            Here’s the comments from OKreylos (doc-ok), that describes the process if you’re interested:


          • VR Geek

            Interesting. Thanks

          • Guest A

            It’s true that the camera isn’t as limiting as it sounds, but of the two (camera and IMU) the camera is the limiting factor. And like you said, they have to be used in conjunction. The IMU is very fast but not accurate; the accuracy degrades over time (i.e., drifting). How many seconds before it’s no longer tolerable as 1:1 depends on the specifics of the IMU. You need the camera to give you that accuracy, and the higher the camera’s resolution, the longer the range at which it can give you that accuracy. However, this means if you want longer range you need higher resolution, and this doesn’t scale well.

  • Foreign Devil

    I keep thinking about the military applications of a laser that can lock in perfectly on a rapidly moving target. . .

  • Po Tato

    Celebrities gonna freak out when their fans track their every move with this kind of device

  • TrevorAGreen

    What I’m curious about is hybrid tracking systems that are domain specific. So if I have a lighthouse system I can track the Vive and the controllers, and any other lighthouse-enabled device. But what if I want to bring in something else that is tracked in the same space? Say I have a coffee mug, and I want to see that. That might be better tracked by a camera than enabling it for lighthouse. Or maybe I buy an item that is specifically designed to match a certain tracking approach. So it has some sensor and the object included, but it still allows it to appear in the same 3D space. Maybe that is a lower resolution of tracking, maybe something even higher. Something specific that would be a cool tactile experience would be foam balls. The games that you could apply that to would be almost endless. But they probably wouldn’t be appropriate for a hard-body powered solution like lighthouse. They would need to be camera tracked, or some other system. We are getting the Vive and the controllers. Now we need to ramp up and start creating other tactile experiences.

  • Fadelis01

    The main issue I would see with this system is the fact that you need to track and differentiate multiple objects in the play space. I do see the need to move as much of the processing burden as possible off the equipment in the player’s hands/head. That is the one benefit that truly remains with the CV1. I have high hopes for this technology, and this guy is on the right track IMHO.

    • Fadelis01

      Hey! Brainwave moment… what if the mirrors and laser detectors took advantage of polarization? This would allow your scanners to scan across a mirror with a custom “polarization pattern” and read orientation and even an object identification key of some sort. I’m imagining a mirror “strip” that would have multiple polarization regions. As the mirror was scanned, a serial string of information could be created in the reflection as the laser reflection passed across the mirror. This could verify that the mirror is the “right” reflective surface for that scanner as well.

      • Fadelis01

        A mirror “strip” could have multiple states at that point and not just binary. The polarization tint lines could look something like this, for example /|/, where the two leading and trailing “/” polarization angles would denote the beginning and end of a string, and the middle angles “|” would be some defined bit of info about the object the mirror is attached to. The scanner could trial multiple angles of sweep until it reads a full strip. At that point, it could infer object type, angle, position, and distance from the scanner.

        • Fadelis01

          You could even have one contiguous mirror “halo” with a polarization “Descriptor phrase” going all the way around. At that point, the scanner simply finds the reflection, then finds the angle where it can “read” and the phrase that is visible could also state the rotation markers for the object… The mirror could utilize the same “microsphere” technology used in reflective safety paint and the Polarization could be a simple plastic polarizing strip that would just “glue” on top of that painted finish. This would be very cheap and robust.

          • Fadelis01

            The paint idea brings up other thoughts… what if there were phosphorescent “dots” that could be charged with a low-level UV strobe. An optical camera could then broadly view the space and direct a more precise laser to the right points of interest. This would still require no power or logic from the tracked objects, but would further protect users from getting blinded by laser scanning.

        • Fadelis01

          AND if the polarization phrase was done via a transparent LCD overlay, then button/trigger info could be relayed via the same positional scan back to the room scanners…

  • MosBen

    What is the value in a greater range? With the Vive at room scale everyone’s concerned about not having enough space for VR. What does a greater range allow us to do that I’m missing?

  • OgreTactics

    How precise is it? I mean: how small can the reflective marker be and still be tracked by this system?

    Because if precise/small, then you could very well have a 3-laser array projected by the mirror onto a surface with 3 tiny markers side-by-side for XYZ movement/orientation tracking. But of course you would ultimately have to use different marker areas and MTS units to track the whole 360° movement of an object.

    Anyway, I’m convinced this can be WAY smaller and cheaper than Lighthouse. In fact I don’t know just how small it can be reduced (perhaps until it can be integrated into the headset itself), which lidars will never be.

  • Eric B

    I wish this tracking system was available. Any chance Jack is still working on this?
