NASA Looks to PlayStation VR to Solve Key Challenge of Space Robot Operation

While NASA has a long history of sending probes and rovers into space, advancements in robotics have made the deployment of human-like robots an increasingly attractive prospect. But it turns out that controlling such humanoid robots remotely is challenging. NASA and Sony have been collaborating to explore how VR might be used to train operators to control robots in space.

While probes and drones are great at what they do, they are very specialized. The appeal of humanoid robots—those that mimic human form and dexterity—is their flexibility. Humans are amazing generalists, using our brains to achieve things that our bodies were never made for (like space travel). Part of what makes us so adaptable is our bipedal stance, which frees up our arms, hands, and fingers for tasks (instead of locomotion), and our use of tools. Our hands can grip and manipulate a breadth of forms unmatched by machines—but robots are quickly catching up.

NASA’s Robonaut 2 is a humanoid robot designed with arms, hands, and fingers that move just like ours. But designing dexterous robots for space is only half the challenge of actually making them useful.

NASA is highly experienced in controlling probes and rovers with carefully planned, math-based maneuvers, but that approach lacks the speed and intuition of direct human control; Robonaut 2’s human-like dexterity is wasted if commands can’t be executed with human fluidity and improvisation.

Sony’s Richard Marks demonstrates simulated control of a robot in space.

So NASA is exploring how to control humanoid robots with human input. Modern virtual reality, as it turns out, may provide the best way to do just that—by making the robot mimic the input of a remote operator—and NASA collaborated with Sony to create a PlayStation VR tech demo called Mighty Morphenaut to explore how this might work.

“The hope is that by putting people in an environment where they can look around and move in ways that are much more intuitive than with a mouse and keyboard, it would require less training to understand how to operate the robot and enable quicker, more direct control of the motion,” Garrett Johnson, a software engineer at NASA’s JPL, told me.

The demo is pure simulation, running in real-time on a PS4, but it’s built to replicate the challenges that would actually come into play for humanoid robots in space, including the robot’s range of motion and the dreaded time delay.


So long as the human operator is sufficiently tracked—in this case using the PlayStation VR headset and Move controllers—robot control is actually quite simple, as the robot can mimic the motions of the operator thanks to its humanoid design. But even if the robot could keep perfect pace with the operator, the distances involved can introduce communication delays that cause lag between the operator’s input and the robot’s movements.

Compensating for this time delay is a huge challenge for effectively controlling humanoid robots in space when the operator is back on Earth. So the Mighty Morphenaut demo includes a time delay mode in which the user sees ‘ghost’ hands that move instantly, while the robot’s actual movement follows along after the fact.
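To make the idea concrete, here is a minimal sketch (in Python, purely illustrative; not NASA’s or Sony’s actual code) of how such a ghost-hand teleoperation loop might be structured: the operator’s tracked hand pose is rendered immediately as the ‘ghost’, while the same pose is queued and only applied to the simulated robot after a round-trip delay. The callback names and the delay value are assumptions made for illustration.

```python
import time
from collections import deque

ROUND_TRIP_DELAY = 2.0  # seconds; an illustrative figure, not an actual mission delay


class DelayedRobotArm:
    """Holds commanded poses 'in flight' and applies them only after the delay elapses."""

    def __init__(self, delay):
        self.delay = delay
        self.in_flight = deque()   # (send_time, pose) pairs still travelling
        self.current_pose = None   # pose the remote robot is actually holding

    def send(self, pose, now):
        self.in_flight.append((now, pose))

    def update(self, now):
        # Any command whose delay has elapsed becomes the robot's real state.
        while self.in_flight and now - self.in_flight[0][0] >= self.delay:
            _, self.current_pose = self.in_flight.popleft()
        return self.current_pose


def teleop_loop(get_tracked_pose, render_ghost, render_robot, dt=1 / 60):
    """Ghost hands follow the operator instantly; the robot follows after the delay."""
    arm = DelayedRobotArm(ROUND_TRIP_DELAY)
    while True:
        now = time.monotonic()
        live_pose = get_tracked_pose()   # e.g. a pose from a tracked motion controller
        render_ghost(live_pose)          # instant feedback for hand-eye coordination
        arm.send(live_pose, now)
        render_robot(arm.update(now))    # what the delayed robot is doing right now
        time.sleep(dt)
```

The split is visible in the last two render calls: the ghost pose tracks the operator frame by frame, while the robot pose lags by the full delay, which is exactly the mismatch an operator has to learn to work around.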

“I’m pretty good at it because I’ve done it a lot,” said Richard Marks, head of Sony’s Magic Lab. “Usually when we put people in [at first with the time delay] they can’t do anything.”

And that’s exactly the point: the instant feedback lets the operator make use of innate hand-eye coordination, and enough time training with the system can make a huge difference in the ability to compensate for the time delay, as Marks demonstrated.

See Also: The Gulf Between High End Military VR and Consumer VR is Rapidly Shrinking

Marks told me that while this demo is a simulation, it should be entirely possible to overlay the ghost hand visualization onto real footage, making this technique a possible solution to one of the key challenges of operating dexterous space robots effectively.

But they’ll have to go further before it’s perfect. Johnson notes that one major piece of feedback from the tech demo is that while the ghost hands enhanced the understanding of movement, interacting with objects in motion was still difficult.

“[With Mighty Morphenaut] we were able to explore a possible solution, and I think our application worked well to demonstrate the problems of operating with delayed communication,” he told me. “However, even in our simulation, there are still a number of problems to solve. With time delay, anticipating the motion of a floating object makes it nearly impossible to interact, so further research might include ways to help users predict that kind of motion.”
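One conceivable aid along those lines—sketched below as my own assumption, not anything JPL has described—would be to extrapolate a drifting object’s position forward by the round-trip delay (here with a simple constant-velocity guess) and show the operator where it will be when their command actually arrives.

```python
def predict_position(position, velocity, delay):
    """Constant-velocity guess at where a free-floating object will be after `delay` seconds."""
    return tuple(p + v * delay for p, v in zip(position, velocity))


# A tool drifting at 5 cm/s along x, with a 2-second round trip:
print(predict_position((1.0, 0.2, 0.0), (0.05, 0.0, 0.0), 2.0))  # -> (1.1, 0.2, 0.0)
```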

NASA, as it turns out, has been actively experimenting with modern VR tech to solve challenges relating to space exploration. The organization has explored uses for the Oculus Rift, Virtuix Omni, and Microsoft’s HoloLens, among others, and of course has a long history of using VR systems of an earlier era for training.