Road to VR last caught up with Sixense at CES, where they demonstrated an early integration with Gear VR to add positional tracking. The 'Jedi Training' demo now comes to GDC, running on Unreal Engine 4. Ben Lang speaks to Sixense Creative Director Danny Woodall about what the engine was like to work with.
If it weren't for the conference schedule and the headlines being dominated by all things virtual reality, GDC 2015 could quite legitimately be dubbed 'Game Engine Wars'. With Epic's Unreal Engine and the newly released Unity 5 both loudly announcing that they're now 'free' to use and flaunting their virtual reality credentials, things just got interesting for VR developers.
Sixense aren't picking sides, though. Their motion tracking controller system, STEM, has been well supported by software designed to show off its unique wireless capabilities, the majority of it built on Unity. This time the demo runs on UE4, and our very own Scott Hayden went hands-on with the new build.
Sixense's creative director, Danny Woodall, strapped me into the DK2 and headphones, then handed me the two STEM motion controllers — a first for me, as I had never used the motion tracking device before. The demo on the expo floor showcased an updated version of their popular 'Jedi Training' simulator, now running on Unreal Engine 4: a room where you're given two lightsabers of your choice and are immediately besieged with laser fire from a floating orb droid. I caught several laser bolts to the face, damaging my pride more than anything else, but I quieted my restless mind and focused on the task at hand. I became one with my dual-wielded lightsabers, and easily caught and even deflected a few shots back at the droid, stunning it to my ultimate satisfaction.