A developer at Facepunch Studios, the team behind Garry’s Mod, has whipped up a Minecraft VR prototype for the HTC Vive. Thanks to natural input devices like the SteamVR controllers, interaction and manipulation will be beyond anything gamers have seen before, completely redefining building games as we know them.
James King from Facepunch Studios has crafted a simple Minecraft prototype for the HTC Vive which lets players stand high above the randomly generated cube terrain. In Vivecraft, building is as easy as reaching out into space and clicking to place a cube. Want a wall or box quickly? Just click and drag in two dimensions, or three if you prefer. Want to check out the inside of your creation? Just lean down and take a look. And if destruction is your thing… you’ll have a blast throwing TNT (sorry).
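The click-and-drag mechanic described above boils down to filling every cell of an axis-aligned box spanned by two corner points. A minimal sketch of that idea (the grid representation and function names here are illustrative assumptions, not Vivecraft’s actual code):

```python
def fill_box(grid, corner_a, corner_b, block):
    """Place `block` in every cell of the axis-aligned box spanned by
    two corner cells. `grid` is a hypothetical dict-based voxel store
    mapping (x, y, z) tuples to block types."""
    (x0, y0, z0), (x1, y1, z1) = corner_a, corner_b
    for x in range(min(x0, x1), max(x0, x1) + 1):
        for y in range(min(y0, y1), max(y0, y1) + 1):
            for z in range(min(z0, z1), max(z0, z1) + 1):
                grid[(x, y, z)] = block

grid = {}
# Drag from one corner to the other: a 3-wide, 2-tall wall (6 blocks)
fill_box(grid, (0, 0, 0), (2, 1, 0), "stone")
```

Dragging in two dimensions keeps one axis fixed (a wall); dragging in three sweeps out a full box, but the fill loop is the same either way.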
If you happen to have a Vive (which is like 100 people in the world at this point), you can download Vivecraft for yourself from WEARVR.
Minecraft is a magical game for many reasons, but perhaps chief among them is the accessible free-form creation. In Minecraft, the entire game world is formed of cubes which players can destroy or place at will. Vast virtual cities have been created by players who would otherwise not have the skills to do so in something like a CAD program.
The key insight behind this accessibility is that it’s controlled in a very conceptually human way: if a player wants to place a cube somewhere, they run over to that place and click a button to drop a block. Many gamers are already familiar with moving characters through a game world using standard FPS controls, which actually amount to a learned form of 6DOF navigation (the ability to move along all axes and rotate the view).
But while this 6DOF movement requires a synchronized dance of keyboard and mouse (or both hands on a gamepad) that must be learned, natural input devices like the SteamVR controllers or Oculus Touch make 6DOF input as simple as moving your hand through the air, something everyone knows how to do just by virtue of being human. It’s for this reason that natural input is going to upend not just building games (making them even more accessible and enabling faster, more effective creation) but input for many other computer applications as well.
Notice the distinction I make between ‘gesture’ input and ‘natural’ input. The former is the frustrating joke of an input paradigm that players know all too well from the Wii and Kinect. With these systems, broad, predefined player movements (gestures) are recognized as button presses, often inconsistently. Pretend as you might that your motions matter, anything gesture input can do could be accomplished just as well with a simple controller of buttons and sticks.
Natural input, on the other hand, is true 1:1 motion—highly reliable and accurate 6DOF input, allowing for real manipulation of a virtual environment, without the need for gesture recognition. With natural input, you can easily reach into a digital world and interact using movements that wouldn’t just be difficult to achieve with a mouse+keyboard or controller, they’d be impossible.
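That 1:1 quality is what makes placing a block feel effortless: the tracked controller’s position maps directly onto a voxel cell with no gesture recognition in between. A tiny sketch of that mapping, assuming a world-space hand position in meters and an illustrative cell size (not Vivecraft’s actual values):

```python
import math

def hand_to_cell(hand_pos, cell_size=0.25):
    """Quantize a tracked hand position (x, y, z in meters, world
    space) to the voxel cell it falls in. This is the whole 'input
    pipeline' for 1:1 natural input: no gestures to recognize, just
    a direct position-to-cell mapping. Cell size is a made-up value."""
    return tuple(math.floor(c / cell_size) for c in hand_pos)

# Reach out and click: the hand's position IS the target cell.
cell = hand_to_cell((1.3, 0.9, -0.2))  # → (5, 3, -1)
```

Contrast this with gesture input, where the same reach would have to be classified against a library of motion templates before anything happens, which is exactly where the inconsistency comes from.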