This week Google revealed Project Starline, a booth-sized experimental system for immersive video chatting, purportedly using a bevy of sensors, a light-field display, spatial audio, and novel compression to make the whole experience possible over the web.
This week during Google I/O, the company revealed an experimental immersive video chatting system it calls Project Starline. Functionally, it's a large booth with a big screen that displays the person on the other end of the line volumetrically and at life-size scale.
The idea is to make the tech seamless enough that it really just looks like you're seeing someone else sitting a few feet away from you. Though you might imagine Starline was inspired by the pandemic, the company says the project has been "years in the making."
Google isn’t talking much about the tech that makes it all work (the phrase “custom built hardware” has been thrown around), but we can infer what a system like this would require:
- An immersive display, speakers, and microphone
- Depth & RGB sensors capable of capturing roughly 180° of the subject
- Algorithms to fuse the data from multiple sensors into a real-time 3D model of the subject
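Google hasn't said how its fusion pipeline works, but the standard approach to the third bullet is to back-project each camera's depth map into 3D points and transform them all into a shared world frame. A minimal sketch of that idea (the intrinsics, poses, and NumPy-based structure here are assumptions for illustration, not Starline's actual pipeline):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth map (meters) to 3D points in camera space,
    using pinhole intrinsics (focal lengths fx/fy, principal point cx/cy)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

def fuse_views(depth_maps, intrinsics, poses):
    """Merge several cameras' depth maps into one point cloud by moving
    each camera's points into a shared world frame via its 4x4 pose."""
    clouds = []
    for depth, (fx, fy, cx, cy), pose in zip(depth_maps, intrinsics, poses):
        pts = depth_to_points(depth, fx, fy, cx, cy)
        pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
        clouds.append((pts_h @ pose.T)[:, :3])            # camera -> world
    return np.vstack(clouds)
```

A real system would go much further, turning the merged cloud into a textured, hole-free mesh in real time, which is where the "custom built hardware" likely earns its keep.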
Google also says that novel data compression and streaming algorithms are an essential part of the system. The company claims that the raw data is “gigabits per second,” and that the compression cuts that down by a factor of 100. According to a preview of Project Starline by Wired, the networking is built atop WebRTC, a popular open-source project for adding real-time communication components to web applications.
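The claimed numbers are worth a quick back-of-envelope check. Assuming "gigabits per second" means a few Gbit/s of raw sensor data (an assumption; Google hasn't given an exact figure), a 100× reduction lands in the tens of megabits per second:

```python
# Illustrative numbers only -- Google has not published exact figures.
raw_gbps = 4.0                            # assumed raw capture rate, Gbit/s
compression_factor = 100                  # reduction claimed by Google
compressed_mbps = raw_gbps * 1000 / compression_factor
print(compressed_mbps)                    # 40.0 Mbit/s
```

That would put Starline's stream in the same rough ballpark as a high-quality HD video call, which is consistent with it running over WebRTC-style networking.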
As for the display, Google claims it has built a "breakthrough light-field display" for Project Starline. Indeed, from the footage provided, it's a remarkably high-resolution recreation; it isn't perfect (you can see artifacts here and there), but it's definitely impressive, especially for real-time.
Granted, it isn’t yet clear exactly how the display works, or whether it fits the genuine definition of a light-field display (which can support both vergence and accommodation), or if Google means something else, like a 3D display showing volumetric content based on eye-tracking input. Hopefully we’ll get more info eventually.
One hint about how the display works comes from the Wired preview of Project Starline, in which reporter Lauren Goode notes that, "[…] some of the surreality faded each time I shifted in my seat. Move to the side just a few inches and the illusion of volume disappears. Suddenly you're looking at a 2D version of your video chat partner again […]." This suggests the display has a relatively small eye-box (meaning the view is only correct if your eyes are inside a specific area), which is likely a result of the particular display tech being employed. One guess is that the tech is similar to the Looking Glass displays, but that Google has traded eye-box size in favor of resolution.
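Goode's description maps neatly onto how an eye-box constraint would surface in software: render the volumetric view while the tracked eye position stays inside the box, and fall back to a flat image once it leaves. A toy sketch of that fallback logic (entirely hypothetical; the function, coordinates, and thresholds are invented for illustration):

```python
def view_mode(eye_pos, box_center, box_half_extent):
    """Hypothetical eye-box check: return 'volumetric' only while the
    tracked eye position lies within the box, else fall back to 'flat-2d'.

    All arguments are (x, y, z) tuples in meters; box_half_extent gives
    the box's half-size along each axis.
    """
    inside = all(abs(e - c) <= h
                 for e, c, h in zip(eye_pos, box_center, box_half_extent))
    return "volumetric" if inside else "flat-2d"
```

With a box only a few inches wide, shifting sideways in your seat would flip the mode, which is exactly the flattening effect Goode describes.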
From the info shared so far, Google indicates that Project Starline is early and far from productization. But the company plans to continue experimenting with the system and says it will pilot the tech with select large enterprise partners later this year.