Lyrobotix is developing a portable outside-in positional tracking system for mobile VR headsets that uses a combination of ultrasonic and Lighthouse-like tracking.

While big names in the VR space hope to create a robust inside-out positional tracking solution for mobile VR headsets, one company wants to simply reduce the cost and friction of what we already know works well: outside-in tracking.

Lyrobotix, a Beijing-based startup, is building an outside-in tracking system for mobile VR headsets that’s worth keeping an eye on. The system, which works (for now) by attaching a tracking sphere to your VR headset of choice, relies on a single, battery-powered emitter which shoots out sweeping lasers as well as ultrasonic pulses to achieve positional tracking. The sweeping-laser approach is clearly inspired by Valve’s Lighthouse technology (right down to the design of the emitter), but, to my understanding, is of Lyrobotix’s own design rather than a license of Valve’s tech.


The lasers and ultrasonic waves are picked up by the small tracking sphere which is covered in receivers. In the prototype demo I saw this week at VRS 2016, the sphere was plugged directly into a Gear VR headset for both power and data relay.


With the emitter placed on a shelf in front of me (the emitter supposedly has 5 hours of battery life), I put on the Gear VR and put the system through its paces. I was presented with a 3D Tetris game where the Tetris pieces were scattered to my left and right, and I had to use the motion controller (built around the same tracking sphere) to put pieces in the right position to win the game. Two controllers are supported, though I only used one during my demo of the prototype.


As I moved my head and body around the tracking space, it responded appropriately to my movements and I didn’t spot any ‘bending’ (where the system thinks I’m moving slightly toward it as I move my head perpendicular to the emitter). With only a single emitter in the front, the tracking space was not room-scale, but felt roughly on par with a front-facing Rift setup (giving me a step or two in any direction).

While the latency of this prototype leaves much to be desired, the accuracy of the system impressed (despite feeling like positional movements were constrained to a grid of small quantized points), and so did its usability. Despite the latency, the tracking was certainly functional, and turning to one side or the other to reach out to grab and manipulate the virtual Tetris pieces really did work. This made me curious about how much potential the system has to improve as it continues to be refined toward a consumer product.


Lisa Zhao, COO of Lyrobotix, told me that the prototype system was not yet fusing IMU data; the positional tracking I was seeing in the demo relied entirely on raw data from the emitter and tracking sphere, which updates at only 60Hz. On major headsets like the Vive and Rift, the onboard IMU is sampled at very high rates (in the hundreds of Hz) and used to determine the movement of the headset (with the addition of prediction); that data is then fused with a slower outside-in tracking system, which makes absolute positioning corrections to prevent drift (much like using GPS coordinates to repeatedly correct small segments of dead reckoning). If Lyrobotix is able to achieve the same sensor fusion with Gear VR’s IMU (and they say they will be able to), it could mean a big improvement over the unfused performance I saw on the prototype (and may also be an answer to the quantized positional behavior I noticed).
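To make the fusion scheme concrete, here is a minimal one-dimensional sketch of the general idea: a fast-but-drifting dead-reckoning path corrected by slow absolute fixes. This is an illustration of the principle only, not Lyrobotix’s (or Valve’s or Oculus’s) actual implementation; all names, rates, and the simple complementary-filter correction are my own assumptions, and real trackers use Kalman-style filters over full 6DOF state.

```python
def fuse(imu_velocities, absolute_fixes, imu_dt, alpha=0.2):
    """Dead-reckon position from high-rate IMU velocity samples, nudging
    the estimate toward slow absolute fixes (e.g. 60Hz optical/ultrasonic
    positions) to cancel accumulated drift.

    imu_velocities: one velocity reading per IMU tick (m/s)
    absolute_fixes: dict mapping tick index -> absolute position fix (m)
    imu_dt:         seconds per IMU tick
    alpha:          how strongly each fix pulls the estimate (0..1)
    """
    pos = 0.0
    trajectory = []
    for i, v in enumerate(imu_velocities):
        pos += v * imu_dt                # fast but drifting IMU path
        if i in absolute_fixes:          # slow, drift-free correction
            pos += alpha * (absolute_fixes[i] - pos)
        trajectory.append(pos)
    return trajectory
```

If the IMU over-reads while the outside-in fix says the headset hasn’t moved, each slow correction drags the estimate back toward truth, which is exactly the GPS-over-dead-reckoning analogy above.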


I’m still skeptical about the value of combining ultrasonic tracking with Lighthouse-like tracking—after all, it seems like one of the two should be sufficient—and I’ve reached out to the company for clarity on this.
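For what it’s worth, one speculative answer (my own guess, not something the company has confirmed): a single station’s laser sweeps may give only the marker’s bearing, its azimuth and elevation as seen from the emitter, while the ultrasonic time of flight supplies the missing range, and together they fix a full 3D position. A sketch of that geometry, with all values hypothetical:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def marker_position(azimuth_rad, elevation_rad, pulse_time_s):
    """Combine laser-sweep bearing angles with an ultrasonic time-of-flight
    range into Cartesian coordinates relative to the emitter."""
    r = SPEED_OF_SOUND * pulse_time_s          # range from ultrasound
    x = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    y = r * math.sin(elevation_rad)
    z = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    return (x, y, z)

# A marker dead ahead whose pulse takes ~5.8 ms to arrive sits ~2 m away.
```

Angles alone constrain the marker to a ray from the emitter; the ultrasonic range then picks out a single point along that ray.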

Provided there’s room to reduce the latency, Lyrobotix’s positional tracking solution could be an interesting option. The setup is so simple—place the emitter in front of you, flip the switch, put on the headset—that it could be a good stopgap until mobile inside-out positional tracking is affordable and widespread, and could even remain a practical solution for positional tracking on low-end mobile systems thereafter.

Zhao says that the company is rapidly moving toward the launch of a development kit of the system. At the beginning of December, Lyrobotix plans to solicit interest in the dev kit and will give a small batch away for free, with following dev kits priced at $100.

This article may contain affiliate links. If you click an affiliate link and buy a product we may receive a small commission which helps support the publication. See here for more information.


  • David Herrington

    This seems to be an interesting solution and may be worth something in the future, but with other outside-in solutions already readily available I think this may be redundant. Sadly, I suspect no one will be interested in this, as it would just segment the market even more.

    • DiGiCT Ltd

      The interesting part is that they’re working on a solution for mobile VR, so I would not say nobody is interested.
      However, the problem that often occurs with Chinese companies is that they try to bend around patents and re-engineer even the wheel.
      The gap in the market for mobile VR is certainly there, but HTC could easily fill that gap by selling their Lighthouses at a better price, together with controllers and an HMD module that has Lighthouse tracking.
      I am waiting for such an HMD for mobiles that just works with my Lighthouses using Daydream lol.

      So yes, I agree with what you say about already-existing tech; Lighthouses could do that job, since they don’t need to connect to anything other than a power supply, which makes them a good universal tracking system.

  • psuedonymous

    “I’m still skeptical about the value of combining ultrasonic tracking
    with Lighthouse-like tracking—after all, it seems like one of the two
    should be sufficient”

    My assumption would be that, using a single Lighthouse-style basestation and their very small marker constellation (that sphere looks to be only ~100mm across sensor-to-sensor), the ultrasonic portion is purely for measuring the distance from the basestation to the marker. The optical component constrains the possible position in X and Y (well, in spherical coordinates anyway) relative to the basestation but does not provide a useful Z value; that is what the ultrasonic ranging is doing. The ultrasonic sensor could not be eliminated without adding another method of measuring the Z value (either a second basestation, or a much larger marker).

    • benz145

      Ah, that sounds like a plausible theory. Lighthouse can do full 6DOF with one basestation, but it’s possible this company isn’t able to do so reliably with their laser-based approach. I wonder then if ultrasonic can achieve the latency needed to be “good enough”; it has generally been panned as a tracking technology for VR. It would be weird to have higher latency in the Z direction than in X/Y!

      • Val

        For those of you born yesterday, Logitech had a 6-dof ultrasonic head tracker in the early 1990’s. This tech is not scalable to more than about 5 feet if you understand physics. They pop up on eBay every once in a while…

        • 张道宁

          Thanks for the comments :) Our reasons for using ultrasound in our solution are as follows:
          1. The device can be made smaller and more portable. (Lighthouse needs receiving sensors to cover a large area, so its headsets and controllers are relatively large.)
          2. Our costs are kept low, for higher affordability. We believe that mobile VR interaction solutions should not cost over 100 US dollars. (Lighthouse relies on high-precision hard-drive motors, large amounts of signal-processing circuitry, FPGA chips, and high-processing-power chips. The entire solution does not match the cost structure of mobile VR.)
          3. Our solution uses optic and acoustic signals. The precision is on par with PS VR, with a longer range at 4 × 4 meters (13 × 13 feet).

          • John D Willimann

            We could possibly be interested in such a device if speed is improved.

    • Sponge Bob

      Which brings up a serious question:
      Just how large a marker (or markers) is needed for a single Lighthouse station or a single Rift camera to give any precision in the Z direction at some distance, say 3 to 5 meters?
      Are any objective figures available (other than someone’s VR experience and marketing BS)?
      Without figures this is just empty talk.
      My personal guess is that both a single Lighthouse and a Rift camera are horrible in the Z direction at 5 m distance.

  • Mr. New Vegas

    Please ask them if the controllers are PC compatible and when they will be selling them.
    People with non-Rift/Vive HMDs need a good, reasonably priced pair of VR motion controllers.

    I don’t care about their tech for mobile, but I’ll gladly buy a pair of controllers.

    • 张道宁

      Thanks for your interest! Although mobile VR is our priority, we’ll support PC as well.

  • WOOT! Finally! I’m glad somebody figured this out. I’ve been trying to talk some programmers I know into using a second cellphone to track a GearVR optically and relay the info over Bluetooth for nearly a year now (with no luck), but this can do the same thing, better, and even includes hand controllers! I am getting one of these kits as soon as possible! I hope they can work in some blueprint-level support with the Unreal 4 Engine (maybe via OpenVR?). I could start developing with this right away!

    • 张道宁

      Thank you for your interest ;) We’ll provide an SDK, and we’ll look into the possibility of providing more blueprint-level support.

  • Andrew Jakobs

    I’d rather bet on inside-out tracking—no external ‘basestations’ necessary, so the setup is even easier. It’ll be like the laser-mouse principle: those weren’t that good in the beginning, and ball-based mice were better, but the technology has come a long way, and now laser mice are much, MUCH better than ball-based mice ever were…

  • Interesting for business applications (training, exhibitions, etc…) until inside-out tracking becomes a reality for wireless headsets. I can see a present, but not a future, for this interesting startup.

  • Tux Topo Topo

    Wow!! It’s an awesome piece of tech!!
    Is the tracking fully 360°? What about occlusion between the user’s body and the controllers?
    Any idea when it will be available?
    IF (big if) it can be a cheap technology and do all the good things, it will be a beast. Also, if (even bigger if) it can be connected to a PC and combined with software like RiftCat, it will be an awesome replacement for the Vive/Rift…. CAN’T WAIT FOR THIS!!

    • Tux Topo Topo

      So… studying the technology, it can’t be saved from occlusion. The only way would be having 2 (or more) base stations, so the ultrasonic beam (and the optical tracking too) can reach the device from all angles…
      But… using ultrasonic beams from opposed directions, I’m afraid they will cancel each other out, making the entire ultrasonic technology useless… How far am I from the truth?

      • kindfox

        You are correct; they can only support one SINGLE base station for now… As far as I know, their engineers in Beijing cannot even debug multiple base stations in one room :)

  • Dave Leack

    Don’t ever be too skeptical about sensor fusion. Using the correct mathematical approaches, adding sensors can only increase accuracy. Sure there are pitfalls, but it’s akin to asking two observers about an event. Errors and limitations (range, noise, stability) in the information from one observer can be overcome by factoring in the information from another, and another, and another until the limits of the approach are reached.
    Most systems already fuse sensor readings (magnetometer and gyro for example) where one gives a noisy but non-drifting result and another gives a less noisy but drifting result. The final output minimises drift and noise.

    This assumes the fusion is done competently, using the correct maths though. So that’s why *some* skepticism should remain until you see it in action.

  • Dave Leack

    Another point, for anyone who thinks this will “fragment the market further” I have a counter-proposal. According to the advertising on FB, the NOLO devices will be available on kickstarter for less than $100 for the first few backers. This probably means that the final product might be around $150 for two motion tracked controllers and head tracker.
    For anyone who hasn’t got the £700 for a Vive, other, more homebrew options have been sufficient. vRidge/Riftcat and PSMoveService provide an alternative that’s reasonably compelling without breaking the bank (provided you already have a phone with a decent resolution). The difficulty with PSMoveService though is that camera setup and calibration is absolutely key to getting a good system up and running, and it’s pretty tricky. Lighting conditions affect the quality of the tracking and strapping a PSMove controller to your HMD adds a *lot* of weight on just one side unless you’re up for a lot of soldering and messing about.
    I’ve been desperate for a third party solution to motion tracked controllers with head tracking that is halfway decent and inexpensive, and if this is it, I’ll be over the moon.
    The market is *already* fragmented and controlled by companies unwilling to reduce their (IMHO) oddly high prices. I understand all about recouping research costs and so on, so I’m sure they all have arguments for how expensive their kit is, but it’s out of the range of a lot of people.
    I’ll take a little more market fragmentation if the new entrant is easy, good and cheap to set up. I’ll take that right now please. Yesterday, if possible.