MIT Researchers Build VR Testing Ground to Safely Train Autonomous Drones


Training autonomous drones to fly around complex indoor environments inevitably means crashes, not to mention a constant cycle of repair and replacement. To remedy this, MIT engineers created a VR training system that allows drones to “see” virtual imagery while flying around a physically empty test facility.

Dubbed “Flight Goggles,” the new system could “significantly reduce the number of crashes that drones experience in actual training sessions,” MIT engineers say in a blog post, and could also serve as a virtual testbed for diverse environments that would otherwise require physical barriers and lengthy setup time.

“We think this is a game-changer in the development of drone technology, for drones that go fast,” says Sertac Karaman, associate professor of aeronautics and astronautics at MIT. “If anything, the system can make autonomous vehicles more responsive, faster, and more efficient.”

Created initially to work in MIT’s new drone-testing facility in Building 31, the Flight Goggles system integrates a motion-capture system, an image-rendering program, and a number of on-board electronics, including IMUs and custom-built circuit boards that incorporate a powerful embedded supercomputer.

Image courtesy MIT

The latter components are strapped onto the drone via a 3D-printed nylon and carbon-fiber-reinforced frame. The drone seen in the video (linked above and below) is said to fly autonomously at a top speed of 6.7 m/s, or around 15 mph (24 kph).

Karaman and his colleagues can import any number of photorealistic VR environments and transmit the imagery at 90 frames per second to the drone as it flies through the empty facility.
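As a rough illustration of the idea (not MIT’s actual code), the core loop of such a system is: query the drone’s real pose from motion capture, render the virtual scene from that viewpoint, and stream the frame to the drone at 90 Hz. All function names below are hypothetical stand-ins:

```python
import time

FRAME_RATE_HZ = 90  # rendering/transmission rate described in the article
FRAME_PERIOD = 1.0 / FRAME_RATE_HZ

def get_mocap_pose():
    """Hypothetical stand-in for a query to the facility's motion-capture system."""
    return {"position": (0.0, 0.0, 1.5), "orientation": (1.0, 0.0, 0.0, 0.0)}

def render_virtual_view(pose):
    """Hypothetical stand-in for the photorealistic image renderer."""
    return b"<frame bytes>"

def transmit_to_drone(frame):
    """Hypothetical stand-in for the wireless link to the drone's onboard computer."""
    pass

def stream_loop(duration_s=1.0):
    """Stream rendered frames to the drone at FRAME_RATE_HZ for duration_s seconds."""
    frames_sent = 0
    deadline = time.monotonic() + duration_s
    next_frame = time.monotonic()
    while next_frame < deadline:
        pose = get_mocap_pose()            # where the real drone actually is
        frame = render_virtual_view(pose)  # what it would "see" in the VR scene
        transmit_to_drone(frame)
        frames_sent += 1
        next_frame += FRAME_PERIOD         # schedule the next frame tick
        time.sleep(max(0.0, next_frame - time.monotonic()))
    return frames_sent
```

The key property this sketch captures is that the drone’s cameras never see the real room: its perception pipeline is fed rendered imagery keyed to its true, externally tracked pose.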

Over 10 flights, the drone, flying at around 2.3 meters per second (5 miles per hour), successfully flew through the virtual window 361 times, only “crashing” into the window three times, according to positioning information provided by the facility’s motion-capture cameras. Karaman points out that, even if the drone crashed thousands of times, it wouldn’t make much of an impact on the cost or time of development, as it’s crashing in a virtual environment and not making any physical contact with the real world.
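Since “crashes” are judged purely from the motion-capture positions, scoring a flight reduces to checking whether each crossing of the virtual window’s plane fell inside the opening. A minimal sketch, with illustrative window geometry and helper names not taken from the paper:

```python
def passed_through_window(position, window):
    """Return True if a plane-crossing point lies inside the virtual window opening.

    position: (x, y, z) of the drone from motion capture at the moment it
    crosses the window's plane. window: bounds of the opening (illustrative).
    """
    _, y, z = position
    return (window["y_min"] <= y <= window["y_max"] and
            window["z_min"] <= z <= window["z_max"])

def score_flight(crossings, window):
    """Count successful passes vs. virtual 'crashes' into the window frame."""
    passes = sum(1 for p in crossings if passed_through_window(p, window))
    return passes, len(crossings) - passes
```

A point outside the bounds counts as a virtual crash, yet the real vehicle touches nothing, which is why even thousands of failures cost no hardware.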

Since the environment is highly configurable, non-static obstacles such as people could also be introduced into the autonomous drone’s training regime. Karaman envisions a system whereby MIT’s drone test facility is split into two sections—one for the drone to fly around in its VR environment, and another reserved for humans wearing motion-capture suits, effectively inserting a person into the drone’s virtual pathway with no physical risk to that person.

The list of supporters of the VR drone system includes the U.S. Office of Naval Research, MIT Lincoln Laboratory, and NVIDIA.

Karaman and his fellow researchers will present details of Flight Goggles at the IEEE International Conference on Robotics and Automation this week, which takes place May 21st – 25th in Brisbane, Australia.
