‘Planet Whack’ is a Multi-user VR Demo Showcasing PhaseSpace Motion Tracking


A short video heading this article, captured by Super Ventures, shows two people wildly hitting a giant ball with foam swords, and is an amusing showcase of PhaseSpace’s motion tracking technology. Planet Whack is a whac-a-mole-inspired game in which two players share the same VR space, trying to hit cartoon worms emerging from a sphere, with the giant beach ball providing haptic feedback.

PhaseSpace has been a leader in high-performance motion capture for over 20 years, providing solutions for the research, industrial, and graphic arts communities. The company says the Impulse X2E system contains the highest resolution sensors ever in a motion capture camera (36,000 x 36,000 sub-pixel resolution in HDR), using customised linear detectors instead of RGB cameras. Patented active markers, which modulate their own unique codes, eliminate the common motion capture problem of marker-swap, making the system ideal for real-time tracking of multiple people and props.

Planet Whack was initially created for last year’s Siggraph conference as a fun way of highlighting PhaseSpace’s real-time tracking capabilities. As shown in this promotional video, both Planet Whack and Resistor are collaborations with TriHelix, a San Francisco-based VR developer, to create multi-user games using PhaseSpace’s motion tracking technology. Several other demonstrations are available, such as Wizard’s Duel, a VR rock-paper-scissors game, and Dungeon, where players defend a crystal using swords and a staff. Also at Siggraph was Aquarium Earth, in which multiple users walk through a coral reef environment. Further medical and architectural visualisations showcase other potential uses for large-volume VR tracking.

Elizabeth McSheery, business developer at PhaseSpace, says that while these examples are mostly used in demonstration environments, PhaseSpace is “looking towards allowing customers to purchase access to these games when they gain access to our API SDK that makes our system work easily with the Unity game engine.”

The system uses Samsung Gear VR headsets for visual output. The usual limitations of this mobile VR solution (relatively low-performance rendering and orientation-only tracking) are overcome with PhaseSpace’s technology: the headsets are fitted with low-profile LEDs for positional tracking, and rendering is performed on a local server. The server receives positional data from the headsets and props via a microdriver, then transmits the rendered output back to the headsets over wifi, as highlighted in this video.

The result is Gear VRs and props with higher-precision positional tracking than current high-end PC VR solutions, as the Impulse X2E cameras run at 960fps with sub-millimetre accuracy (as low as 20µm). The game also renders faster than the Gear VR’s 60Hz display refresh in order to minimise latency.
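As a rough illustration of why sampling much faster than the display refresh keeps pose data fresh, the toy Python sketch below simulates 960Hz tracking samples being consumed by a 60Hz render loop. This is not PhaseSpace code: the two rates come from the article, but the sampling model and function names are assumptions for illustration only.

```python
TRACKER_HZ = 960  # Impulse X2E camera rate (from the article)
RENDER_HZ = 60    # Gear VR display refresh (from the article)

def latest_sample(samples, render_time):
    """Return the most recent (timestamp, position) sample at or before render_time."""
    return max((s for s in samples if s[0] <= render_time), key=lambda s: s[0])

# One second of simulated tracking samples: (timestamp, position).
samples = [(i / TRACKER_HZ, (0.0, 1.6, i / TRACKER_HZ)) for i in range(TRACKER_HZ)]

# At each render tick the server grabs the freshest pose before drawing.
# A small phase offset models a render loop not phase-locked to the cameras.
for frame in range(RENDER_HZ):
    t_render = frame / RENDER_HZ + 0.0005
    t_pose, pos = latest_sample(samples, t_render)
    age_ms = (t_render - t_pose) * 1000.0
    # With 960Hz tracking, the pose used for any frame is at most ~1 ms stale,
    # far below the ~16.7 ms frame interval of a 60Hz display.
    assert age_ms <= 1000.0 / TRACKER_HZ + 1e-9
```

At 60Hz tracking, by contrast, a freshly drawn frame could be working from a pose nearly a full frame old, which is one reason a high camera rate matters for latency even when the display itself is capped at 60Hz.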

The virtual reality page on PhaseSpace’s website explains how the technology is suitable for events or permanent installations, due to its accuracy, scalability, and functionality even in direct sunlight. “One of the places that you can check out our system is at IMAX VR in LA,” says McSheery. “We have another customer outdoors at Santa Clara Paintball. We have other customers who are working on putting down the foundation for more permanent locations.”


  • yag

    – Gantz VR !
    – it’s a whack-a-mole game
    – huuuuu

  • Roman

    Shouldn’t the latency in transmission of rendered frames to GearVR cause jitter?