Google has released a free app for PC VR headsets called Welcome to Light Fields. The company says the app serves as a showcase of “the emerging technology Google is using to power its next generation of VR content.”

When it comes to capturing the real world for VR, 360 photos and videos can only go so far; they have a number of limitations which make them less immersive than computer-generated VR content rendered in real-time, chief among them the inability to actually move within a captured scene. Since 360 photo and video content can only be viewed from the precise location of the camera, you're effectively stuck in place, able to do little more than rotate your head.

Volumetric capture techniques aim to capture not just a single point of view, but the totality of the scene (or at least a portion of it), so that viewers can move their heads through 3D space within the capture and see the scene from varying perspectives. Light fields are one promising type of volumetric capture, and could represent a flexible, high-quality, foundational format for capturing, generating, and storing VR content.
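
Put another way: a 360 photo stores one color for each viewing direction from a single point, while a light field stores a color for every position and direction within a capture volume, which is what lets the view change as your head moves. Here's a purely illustrative sketch of the difference (hypothetical functions, not Google's actual format):

```python
import numpy as np

# Illustrative only; not Google's actual data structures or file format.

def sample_360_photo(photo: np.ndarray, yaw: float, pitch: float):
    """A 360 photo stores one color per viewing direction from a single point.
    `photo` is an equirectangular image (height x width x 3); yaw and pitch are
    in radians (pitch in [-pi/2, pi/2]). There is no position argument, which is
    why moving your head can't reveal anything new."""
    h, w, _ = photo.shape
    col = int((yaw % (2 * np.pi)) / (2 * np.pi) * (w - 1))
    row = int((pitch + np.pi / 2) / np.pi * (h - 1))
    return photo[row, col]

def sample_light_field(lookup, position: np.ndarray, direction: np.ndarray):
    """A light field stores a color for every position AND direction inside a
    capture volume; `lookup` stands in for whatever interpolation the renderer
    uses. A new eye position inside the volume yields a genuinely new view."""
    return lookup(position, direction)
```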

In light fields like this one you can see how reflections and lighting on shiny objects move correctly as you move your head, instead of being ‘baked’ into an object’s texture as happens with some non-light field approaches to volumetric capture | Image courtesy Google

Initially the scenes look similar to simple 360 stereoscopic photos, but the magic happens when you move your head through space—instead of the world being effectively ‘locked to your head’, you’ll see it move around you just like you’d expect if you were really standing there; it’s much more immersive than a static 360 capture.

Google’s custom light field camera takes about one minute for a complete spin to capture a light field scene | Image courtesy Google

It does seem like magic, but it’s not without limitations. The captures in this case are generated by a custom light field camera which spins an array of GoPro cameras in a circle to capture a spherical area about two feet wide. You can view the scene from anywhere inside that sphere, but if you stick your head outside of it, the world will go blank. That’s because the cameras effectively capture all of the light rays intersecting the sphere on all sides, and algorithms then recreate the view from any point within the sphere for your viewing pleasure. A larger viewing area can be achieved with a larger camera.
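
Google’s actual renderer is considerably more sophisticated (blending many nearby captures and accounting for scene depth), but the core reconstruction idea can be sketched roughly: trace each pixel’s ray from the new eye position out to the capture sphere and borrow the color recorded near where it crosses. Everything below is a simplified, hypothetical illustration, not Google’s code:

```python
import numpy as np

SPHERE_RADIUS = 0.3  # metres; roughly the two-foot-wide capture volume described above

def ray_sphere_exit(origin, direction, radius=SPHERE_RADIUS):
    """Point where a ray starting inside the capture sphere exits it."""
    d = direction / np.linalg.norm(direction)
    b = np.dot(origin, d)
    c = np.dot(origin, origin) - radius ** 2
    t = -b + np.sqrt(b * b - c)  # positive root: the exit point (origin is inside)
    return origin + t * d

def synthesize_pixel(eye, pixel_dir, camera_positions, sample_fns):
    """Color for one pixel of a novel view at `eye` looking along `pixel_dir`.
    `camera_positions` is an (N, 3) array of captured viewpoints on the sphere;
    `sample_fns[i]` returns the color capture i recorded along a given direction."""
    exit_point = ray_sphere_exit(np.asarray(eye, float), np.asarray(pixel_dir, float))
    # Borrow from the capture closest to where the ray leaves the sphere, sampled
    # along the same direction; real renderers blend several captures and use depth
    # estimates to correct the small parallax error this shortcut introduces.
    nearest = int(np.argmin(np.linalg.norm(camera_positions - exit_point, axis=1)))
    return sample_fns[nearest](pixel_dir)
```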

When you move your head outside of the captured area, the scene fades away | Image courtesy Google

Welcome to Light Fields features some very impressive light field imagery. At its best you’re seeing sharp, immersive views with lots of depth and reflections which convincingly react as you move your head. But there are some less-than-stellar scenes (especially if you manually delve into the app’s gallery) that show some of the challenges of capturing quality light fields.

The inside of Space Shuttle Discovery looks astounding as a light field | Image courtesy Google

Some of the scenes are not nearly as sharp as the others, and you can sometimes spot artifacts at the edges of objects, mistaken depth information, and reflections and lighting that don’t seem to behave quite right. And there’s still the big challenge of file sizes. The handful of static light field scenes in Welcome to Light Fields clock in at around 6GB total, which is fine for a demonstration app, but arguably too large for mass adoption.
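
Those sizes are less surprising when you consider what a light field scene contains: hundreds to thousands of full photographs rather than one. As a rough back-of-envelope (the view count and compression below are hypothetical assumptions, not Google’s figures):

```python
# Hypothetical numbers, for illustration only.
views_per_scene = 1000     # order of magnitude for a spinning multi-camera rig
megabytes_per_view = 1.5   # plausible size of one compressed capture

scene_size_gb = views_per_scene * megabytes_per_view / 1024
print(f"~{scene_size_gb:.1f} GB per scene")  # ~1.5 GB; a handful of scenes adds up quickly
```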

Still, Welcome to Light Fields is a powerful example of light field technology and a look at why volumetric capture is probably the future of VR video.




Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • Davo

    Wow, this is spectacular! Quite a few “holy shit” moments as I looked around the scenes. Do the guided tour first though before exploring on your own.

    • JJ

      Omg yes that was epic, thanks for the guided tour tip that was really good!

  • Peter Pan

    We (anyone enthusiastic about any VR application) applaud any of these sorts of projects and innovative views and demos for the next generations to see. We have the knowledge, we have the power, and we don’t need capital for clear, honest and wonderful technology to move our population forward: without economic gain, free to all people, not directed at governance or political or warmongering geopolitical issues. Make it so, let EVERYONE on this planet prosper, and YOU are at the front of a free society!

  • Ian Shook

    Finally somebody coming out with actual viewable lightfields in VR. So many companies have these in their ‘lab’ but not for the consumer to test. I can’t wait to try these.

  • Ian Shook

    Maybe I missed it in the article, but the headline graphic is rendered – does that mean google also has a virtual camera rig for rendering in 3D software? I hope this is shared soon.

    • beestee

      Technically, everything needed to replicate the rig digitally is available; the trick is in the software used to interpolate all of the data and make it useful to a VR HMD.

      16 cameras
      Volume the size of a beach ball to fit in tight spaces

      Not sure if FoV or resolution were disclosed.

      The question remains, though: does file size make this irrelevant for showcasing a digital model, since the raw assets needed to render the model are likely less heavy than the series of captures required to make the light field volume?

      • Ian Shook

        I think any good render would have just as much visual data as a photo, or just about. So I don’t think file size is going to go down in the traditional sense. I know OTOY is working on optimizing the rendering of light fields so that it’s faster than rendering 900 individual renders to make one. So far that seems to be the biggest hurdle – file sizes. The ‘Welcome to Light Fields’ demo uses the CPU; I hope a GPU-powered one comes along so that we can make use of the graphics cards we VR people spent money on upgrading.

    • Ian Shook

      BTW to answer my own question, Google re-modeled the apartment scene by hand for use in this demo, and the apartment itself is actually photo based. The 3D modeled apartment -is- a light field, but they didn’t have any rendered light fields available to view (other than the white clay render)

      • Foreign Devil

        I’m confused by that… does a light field require creating a 3D model of the environment or not?

        • beestee

          No, the light field itself is a series of many still images. A 3D model can be extrapolated from that data through photogrammetry, but the light field data itself is not 3D. The interest above is in generating 6 DoF light fields of purely digital scenes.

          Google and MIT’s news posts about it go into more depth on the topic to explain how light fields work.

  • Mark
    • Ian Shook

      Thanks for sharing

  • Foreign Devil

    Should this be in Oculus store? I could not find it.

    • Ian Shook

      No – at least not yet. It’s in the Steam store.

      • Foreign Devil

        Yeah I ended up getting it from Steam store. It’s great to finally experience light field tech!

  • Ian Shook

    I tried it out (although I had to have the beta enabled). It was really awesome. MORE PLEASE

  • beestee

    This was excellent. Worked great on the HMD Odyssey.

  • Foreign Devil

    I finally got to experience light field tech! Certainly the most immersive way to experience a photograph, and it holds so much promise for what is to come. You could create node-like walkthroughs of famous architecture… I certainly wanted to walk all through that old mahogany mansion. Even on the low-res Rift the detail surpassed any purely CG render. Though from what I can gather it is kind of a CG recreation, because they automate building 3D shells from all the angles of objects in that point of view and then map that geometry with the captured textures… similar to photogrammetry? Or am I wrong, and it is all camera capture info with no CG models being constructed? The fact the showcase shows the room without any textures leads me to believe they auto-create 3D meshes for everything.

    • beestee

      If you look closely at the details in the intro room, there are objects missing and subtle differences between the actual objects and the 3D model. That ‘clay’ model was not generated from photogrammetry, although there is technology in active development that can do this.

      • Foreign Devil

        Yeah, so I wonder why they went to the trouble of modelling it in 3D. I did notice some objects missing in the 3D model version… some things on the fireplace mantle, for example. But having that 3D model with an occlusion pass only… made me think light fields were similar to photogrammetry.

  • Lucidfeuer

    Despite vaporwaring Google Seurat (as of now), we at least got the demos that OTOY never provided despite having advertised and been invested in digital light fields for so long…

    Now I wonder how their setup, and most importantly their pipeline, fares against Lytro’s, for example. In fact it’d be great if there were a comprehensive article about current light field capture/rendering technologies and the different players.

  • william

    hey guys, help me, I can’t play it… how can I enjoy it? Please, someone help me.

  • FireAndTheVoid

    I tried this out. It’s freaking amazing. I can’t wait to see more of this.