Tom Furness has been pioneering virtual and augmented reality for the past 50 years, longer than almost anyone else in the world. He has an amazing history that started back in 1966 while he was in the Air Force building some of the first helmet-mounted displays, visually-coupled systems, and eventually the Super Cockpit. Furness eventually left the military to “beat his swords into plowshares” and bring these virtual reality technologies to the larger public by starting the Human Interface Technology Lab at the University of Washington, which has been doing original research to validate the efficacy of VR for fields ranging from medicine and education to training. He also helped invent the virtual retinal display technology in the early ’90s, which is being used as some of the basis of Magic Leap’s lightfield display technologies. Tom has continued to be a virtual reality visionary, and has some pretty inspiring ideas about the future of the metaverse and education through the Virtual World Society.
LISTEN TO THE VOICES OF VR PODCAST
Tom Furness is an incredible visionary and pioneer of virtual and augmented reality. He talks about the early days of virtual reality, when he was one of three people involved in helping to create the new immersive medium that’s now known as virtual reality. Morton Heilig’s Sensorama was one of the first immersive multimodal experiences, and Ivan Sutherland’s Sword of Damocles was in development at the same time that Tom was working on some of the first helmet-mounted displays for the Air Force from 1966 to 1969. Tom was serving in the military as Chief of the Visual Display Systems Branch, Human Engineering Division of the Armstrong Aerospace Medical Research Laboratory at Wright-Patterson Air Force Base in Ohio.
Tom says that he had different motivations for getting into virtual reality than Ivan did. He has always been much more focused on solving real problems with VR, starting in the late sixties with trying to help fighter pilots cope with the increasing complexity of fighter jet technology.
Some of the specific problems that he was trying to solve included how to use the head to aim while shooting, how to represent information from imaging sensors on virtual displays, and how to make the systems less complex to understand and operate. There was limited space in the cockpit to monitor and control both the flying and the fighting, and so Tom turned to creating augmented reality systems to display more information to the pilots in a virtual environment. This resulted in the first “Visually Coupled Airborne Systems Simulator,” which he helped develop in 1971.
There’s a lot of the history and development of virtual reality that’s remained fairly dark and hidden for the period from the late ’60s to the early ’90s. Tom is the first person I’ve interviewed who has been involved with the evolution of virtual reality from the very beginning. He points out that a lot of the virtual and augmented reality programs he was working on were developed primarily to help fighter pilots in the cockpit first, and that the flight simulations and other training applications came after that.
Now, as then, it’s likely that there’s classified VR and AR research going on in the military space. We have to rely upon various trade magazine reports and books to start to piece together some of the details of these programs. In a book titled Virtual Reality Excursions with Programs in C, the authors write:
In the 1970s, for the first time, the capabilities of advanced fighter aircraft began to exceed that of the humans that flew them. The F-15 had nine different buttons on the control stick, seven more on the throttle, and a bewildering array of gauges, switches, and dials. Worse, in the midst of the stress and confusion of battle, perhaps even as they begin to black out from the high-G turns, pilots must choose the correct sequence of manipulations.
Thomas Furness III had a background in creating visual displays dating back to 1966. He had an idea for how to manage the deluge of information provided to pilots. He succeeded in securing funding for a prototype system to be developed at Wright-Patterson Air Force Base in Ohio. The Visually Coupled Airborne Systems Simulator (VCASS) was demonstrated in 1982. Test pilots wore the Darth Vader helmet and sat in a cockpit mockup.
VCASS included a Polhemus tracking sensor to determine position, orientation, and gaze direction in six degrees of freedom. It had one-inch diameter CRTs that accepted images with two thousand scan lines, four times what a television uses. It totally immersed the pilot in its symbolic world, a world which was created to streamline the information to be presented to the pilot.
The VCASS system eventually led to the development of the Super Cockpit program in 1986, which Tom describes as a system where you “put on a magic helmet, magic flight suit, and magic gloves” and were then transported into an immersive virtual world. The Super Cockpit was the main project that Tom worked on throughout the ’80s. This brief academic article submitted to the Proceedings of the Human Factors Society in 1986 gives some of the most specific details I could find on how it included features such as “head-aimed control, voice-actuated control, touch sensitive panel, virtual hand controller, and an eye control system.”
The Encyclopedia Britannica has the following to say about the Super Cockpit:
From 1986 to 1989, Furness directed the air force’s Super Cockpit program. The essential idea of this project was that the capacity of human pilots to handle spatial information depended on these data being “portrayed in a way that takes advantage of the human’s natural perceptual mechanisms.” Applying the HMD to this goal, Furness designed a system that projected information such as computer-generated 3D maps, forward-looking infrared and radar imagery, and avionics data into an immersive, 3D virtual space that the pilot could view and hear in real time. The helmet’s tracking system, voice-actuated controls, and sensors enabled the pilot to control the aircraft with gestures, utterances, and eye movements, translating immersion in a data-filled virtual space into control modalities. The more natural perceptual interface also reduced the complexity and number of controls in the cockpit. The Super Cockpit thus realized Licklider’s vision of man-machine symbiosis by creating a virtual environment in which pilots flew through data.
Tom left the Air Force soon after the Super Cockpit program, and started to bring some of these virtual reality technologies to the wider world in what he characterizes as “beating [his] swords into plowshares.” One of the technologies that he helped to invent was the virtual retinal display technology where photons are scanned directly onto the retina. Though the patent has now expired, Tom says this is some of the technology that the mysterious Magic Leap is based upon.
In an article in Aviation Today from 2001, Tom says that they licensed out the virtual retinal display technology in 1993 so that it could be commercialized for the next-generation Super Cockpit called the ‘Virtual Cockpit Optimization Program’.
John R. Lewis, an employee of Microvision, suggests in this IEEE Spectrum article that Tom was working on similar scanned-beam technology while on the Super Cockpit program, and gives some more details as to what became of the program:
The military gave scanned-beam technology its start in the 1980s as part of the U.S. Air Force’s Super Cockpit program. Its team, led by Thomas A. Furness III, now at the University of Washington, Seattle, produced helmet-mounted displays with an extremely large field of view that let fighter pilots continuously see vital data such as weapons readiness. The displayed information moved with the pilot’s head, giving him an unobstructed view of what was going on in front of him and helping him to distinguish friend from foe.
There are a number of Microvision press releases about the continuation of the “Virtual Cockpit Optimization Program” into the early 2000s, but it’s unclear whether the virtual retinal display or augmented reality helmets ever moved beyond the prototype and proof-of-concept stage into full operational use. It’s likely that the military uses of augmented and virtual reality have continued in training, but it’s unclear to what extent the technology has continued to develop and potentially be used in combat.
Tom talks about how he thinks the virtual retinal display technology has the potential to solve some of the problems around the vergence-accommodation conflict in pixel-based virtual reality head-mounted displays. He also shared his surprise that scanning photons directly onto the retina can actually allow some people with impaired vision to see, and so he started to develop some virtual retinal display peripherals designed for people with impaired vision.
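To make the vergence-accommodation conflict concrete, here is a minimal sketch of the underlying geometry. It assumes a conventional stereo headset with a fixed focal plane (the 2 m focal distance and 63 mm interpupillary distance are illustrative values, not figures from the interview): the eyes converge on the virtual object’s distance while the lenses must focus on the display’s fixed plane, and the mismatch between the two is what retinal and lightfield displays aim to eliminate.

```python
import math

def vergence_angle_deg(ipd_m: float, distance_m: float) -> float:
    """Angle between the two eyes' lines of sight when fixating
    on a point at distance_m, given interpupillary distance ipd_m."""
    return math.degrees(2 * math.atan(ipd_m / (2 * distance_m)))

def va_conflict_diopters(virtual_distance_m: float, focal_distance_m: float) -> float:
    """Mismatch between where the eyes converge (the virtual object)
    and where they must focus (the display's fixed focal plane),
    expressed in diopters (1 / meters)."""
    return abs(1 / virtual_distance_m - 1 / focal_distance_m)

# A virtual object at 0.5 m viewed through a headset with a 2 m focal plane:
print(round(vergence_angle_deg(0.063, 0.5), 1))   # eyes converge at ~7.2 degrees
print(round(va_conflict_diopters(0.5, 2.0), 2))   # 1.5 D accommodation mismatch
```

A mismatch above roughly a quarter to half a diopter is commonly cited as uncomfortable over time, which is why near-field objects on fixed-focus displays cause eye strain; a display that sets focus per-object drives this term to zero.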
Finally, Tom talks about his vision for turning living rooms into classrooms with the Virtual World Society. He imagines a subscription-based program that could be used to fund free educational virtual world environments that families could explore with their children and potentially help solve some of the world’s most intractable problems.
Tom says that “When you put people into a place, you put the place into people.” If you design a virtual world well enough, then it’s possible that the people who experience it will never forget it because VR allows for a whole-body learning experience.
Tom is a visionary of virtual reality, and it was really inspiring to learn more about the history of the medium and what he sees as the ultimate potential of VR: to be able to appreciate what we have in reality, and to continue to learn, grow, and collaborate with each other to make the world a better place.
Become a Patron! Support The Voices of VR Podcast Patreon
Theme music: “Fatality” by Tigoolio