New Unity Tools Bring Interactivity and Real-time CGI to 360 Video, Now in Beta

360 video combined with real-time CG and interactive elements was demonstrated on stage at today’s Vision VR/AR Summit 2017 keynote. By layering two spheres of video with 3D elements in between, the technique offers a simple way of enhancing standard 360 video footage to make it interactive.

Natalie Grant, Senior Product Marketing Manager of VR/AR/Film at Unity, today showcased an interactive 360 video produced by VFX studio Mirada and built using the Unity 2017 Beta (see the video heading this article). The footage was captured by a 360 camera placed under a gazebo; an animated dinosaur appears in the park outside, and looking up, the viewer sees birds animating in the sky. Both are real-time CG elements, and both are convincingly occluded by the structural beams of the gazebo.

This is achieved by playing two layers of video simultaneously: the outer sphere plays the original 360 video, while the inner sphere uses a custom ‘alpha mask’ shader to make everything but the gazebo structure transparent. Animated 3D objects like the dinosaur and birds can be placed between the two spheres, resulting in an effective, inexpensive illusion of depth.
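The inner-sphere idea can be sketched as a minimal Unity shader (a hypothetical illustration, not Mirada’s actual shader): the material samples the video frame plus a separate matte texture whose luminance drives alpha, so everything except the foreground structure becomes transparent.

```shaderlab
Shader "Unlit/VideoAlphaMask"
{
    Properties
    {
        _MainTex ("Video Texture", 2D) = "white" {}
        _MaskTex ("Alpha Mask", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }
        Blend SrcAlpha OneMinusSrcAlpha
        ZWrite Off
        // Assumes a sphere mesh with inverted normals so the
        // video is visible from inside.

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            sampler2D _MaskTex;

            struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = v.texcoord;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                fixed4 col = tex2D(_MainTex, i.uv);
                // Matte luminance drives alpha: white keeps the
                // foreground (gazebo), black reveals what's behind.
                col.a = tex2D(_MaskTex, i.uv).r;
                return col;
            }
            ENDCG
        }
    }
}
```

CG elements placed between the two spheres then naturally appear behind whatever the matte keeps opaque.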

The demo also illustrated how text or markers can be positioned in 3D at places of interest and used to perform a ‘gaze-based locomotion’ effect, which simply swaps one 360 video for another shot from a different perspective, giving the feeling that the viewer has moved through the scene. Further realism-enhancing techniques were shown: replacing the sun captured in the original footage with a real-time light source, so that animated birds disappear (due to overexposure) when flying across the brightest spot, and real-time lens flares.
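A gaze-based swap like the one described could be sketched roughly as follows in Unity C# (component and field names are illustrative, not taken from the demo):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hypothetical sketch: when the viewer gazes at this hotspot long
// enough, swap the 360 video for one shot from another position.
public class GazeHotspot : MonoBehaviour
{
    public VideoPlayer spherePlayer;   // player on the 360 video sphere
    public VideoClip destinationClip;  // same scene, different camera position
    public float dwellSeconds = 1.5f;

    float gazeTimer;

    void Update()
    {
        // Ray straight out of the headset's forward vector
        var ray = new Ray(Camera.main.transform.position,
                          Camera.main.transform.forward);

        RaycastHit hit;
        if (Physics.Raycast(ray, out hit) &&
            hit.collider.gameObject == gameObject)
        {
            gazeTimer += Time.deltaTime;
            if (gazeTimer >= dwellSeconds)
            {
                spherePlayer.clip = destinationClip;
                spherePlayer.Play();
                gazeTimer = 0f;
            }
        }
        else
        {
            gazeTimer = 0f;
        }
    }
}
```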


Unity says that compelling interactive content based on standard 360 video should be possible with these simple techniques, which are already available to try in the Unity 2017 beta.



  • Gaspar Ferreiro

    that is super cool. I usually do not dive into 360 videos (which many consider to not be true VR), but this kind of makes it a much more appealing option :)

  • Eddie Offermann

    They mentioned the 2017 beta – but that was a little misleading: this was 100% developed (and demonstrated above) in 5.6.0f3. (I’m the developer). That said, they’re clearly putting a lot of effort into really beefing up for Unity 2017 – additional optimization, the inclusion of TextMeshPro (which I used, though as an imported asset rather than native support), and of course the continually improving VR integration.

    So you don’t have to use the beta to build any of this functionality – as long as you’re using 5.6 (so you have their new VideoPlayer component) you’re good. I’ve even done something similar using EasyMovieTexture.
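A minimal 5.6-era VideoPlayer setup for a 360 sphere might look like this (a sketch, not code from the demo project):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Minimal sketch of playing a 360 clip on a sphere with the
// VideoPlayer component that shipped in Unity 5.6.
public class SphereVideo : MonoBehaviour
{
    public VideoClip clip;

    void Start()
    {
        var player = gameObject.AddComponent<VideoPlayer>();
        player.clip = clip;
        player.isLooping = true;
        // Render into the sphere's own material rather than a RenderTexture
        player.renderMode = VideoRenderMode.MaterialOverride;
        player.targetMaterialRenderer = GetComponent<Renderer>();
        player.targetMaterialProperty = "_MainTex";
        player.Play();
    }
}
```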

    • Guygasm

      Will Unity support something akin to “deep compositing” utilizing Facebook’s/Otoy’s volumetric 360 video? Similar to this but with automatic alpha mask per frame and much more depth resolution.

      • Eddie Offermann

        This seems likely. There’s a lot of ongoing collaboration between OTOY, Unity and Foundry – and the rendering guys at Unity are working their asses off in about a dozen different directions.

    • Is there a template / asset that’s going to be released showing this for stereoscopic 360 video?
      360 video with CGI will look awkward if the video is mono. Or was this already stereoscopic? (the gazebo video)

      • DC

        Actually like to know this as well. Having delivered our first live action stereo 360 with stereo animations, I can say it was a ridiculous amount of work to get these two things to exist together without causing headaches or sickness.

      • Eddie Offermann

        There is! Expect to see an asset in the asset store soon (next day or two), as well as a case study shortly thereafter on the Unity site.

        The latter is true-ish, but not necessarily so. For mono source the stereo pair can be set up with an IPD of 0 or the camera setup modified to render the scene once from the center camera and provide lens-compensated versions to both eyes.

        This particular setup *does* allow for multiple layers with different mattes – used with mono footage it could be used for creating a cheat for dimensionalizing footage similar to how 2d footage is dimensionalized for 3d presentation. It would be tedious and labor-intensive, but we do a lot of tedious and labor-intensive things in the name of creativity.
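The IPD-of-zero trick described above can be approximated with Unity’s stereo camera settings (a sketch; `stereoSeparation` is the relevant Camera property):

```csharp
using UnityEngine;

// Hypothetical sketch: treat mono 360 footage as "stereo" by collapsing
// the interpupillary distance to zero, so both eyes get the same view.
public class MonoAsStereo : MonoBehaviour
{
    void Start()
    {
        var cam = GetComponent<Camera>();
        cam.stereoSeparation = 0f; // IPD of 0: identical left/right renders
    }
}
```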

    • Alex Butera

      Would you say that the VideoPlayer component works better than EasyMovieTexture as far as 360 video goes?

      • Eddie Offermann

        I’ve used EasyMovieTexture a lot and love it – but VideoPlayer is a Unity-native component. It’s still under active development, but the goal is to run on all Unity target platforms and it provides automatic (optional) transcoding to optimize for your target platform. That’s a pretty big deal.

        It’s very snappy – I haven’t done competitive profiling between the two but I feel like VideoPlayer performs better at higher resolutions. I don’t believe EasyMovieTexture supports hardware acceleration – I could be wrong – but VideoPlayer definitely does, which is a big deal for h264 which most phones can decode in hardware.

        The biggest advantage for VideoPlayer going forward, though, is that it’s not just a playback component – it’s an extensible video playback architecture. Support for additional codecs – including custom and proprietary codecs – can be added.

        VideoPlayer also includes native support for transparency in video codecs that contain transparency. That’s pretty cool too, since some codecs support multiple additional channels of video that transparency can be stored in. So even where performance might be competitive between the two, VideoPlayer wins on architecture.

        • Alex Butera

          Interesting, thanks for the reply. Though I’m quite certain that EasyMovieTexture uses HW acc. at least on Android.

          • Eddie Offermann

            You may be right – I hadn’t given it much thought, but EMT is, after all, built over ffmpeg which supports HW acceleration natively if built to do so.

        • Sorry for jumping in here but you look like just the person to ask – I’ve done a fair bit of forum searching without success – is there a simple way of setting up stereoscopic view with the new Unity video component? It’s easy with Easy Movie Texture as you can assign multiple target materials and tweak the uvs according to the stereo format. I can’t see this in Unity video. So I create two video components, but at runtime the uvs are reset to defaults for some reason so I can’t split the image.

          • Eddie Offermann

            Probably the easiest answer to that is that we’ll be sharing a sample project soon (on the asset store as a free asset) that has some interesting shaders in it (among other things).

            I just verified that resetting UVs is a behavior when you use Material Override. That feels like a bug, to me, since there’s no way to set tiling & offsets in the VideoPlayer component itself… But then, there’s only a single MaterialOverride available – so you wouldn’t be able to set different eyes from the same component. (You’d probably also need to set up a script to monitor playback between multiple VideoPlayer components to constrain sync between them, particularly if it’s a stream.)
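A sync-monitoring script for multiple VideoPlayer components might be sketched like this (names and drift threshold are illustrative):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hypothetical sketch: keep a second VideoPlayer (e.g. the other eye's
// sphere) locked to a master player by correcting drift once it passes
// a threshold.
public class VideoSync : MonoBehaviour
{
    public VideoPlayer master;
    public VideoPlayer slave;
    public double maxDriftSeconds = 0.05;

    void Update()
    {
        double drift = master.time - slave.time;
        if (System.Math.Abs(drift) > maxDriftSeconds)
        {
            slave.time = master.time; // seek to re-align
        }
    }
}
```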

            My personal favorite solution at the moment is to do it entirely in the shader, on a single material, on a single sphere and to shift the UV space dynamically based on which eye is being rendered.
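The single-shader, per-eye UV shift could look something like this for over/under footage (a sketch assuming single-pass stereo rendering, where `unity_StereoEyeIndex` is available, and left eye on the top half of the frame):

```shaderlab
Shader "Unlit/StereoOverUnder360"
{
    Properties { _MainTex ("Over/Under Video", 2D) = "white" {} }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;

            struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                // Squeeze V into one half of the over/under frame,
                // picking the half for the eye currently being rendered
                // (eye 0 = left = top half, in this assumed layout).
                o.uv = v.texcoord;
                o.uv.y = o.uv.y * 0.5 + 0.5 * (1.0 - unity_StereoEyeIndex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}
```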

          • Thanks Eddie, appreciate the feedback.

          • donbox7

            Hey Eddie, I am a Unity noob, waiting with anticipation for the sample project. If it’s already on the asset store, please share the link. Thanks

          • Eddie Offermann

            Soon – likely in the next couple days. I was waiting on Unity to finish negotiating for the release of part of another paid asset as part of this one (one of the dinosaurs used in the demo) – not wanting to wrap one of their assets up and distribute it separately for free.

            There’s a lot in the demo project as well that I’m writing fairly extensive documentation for. Even though there are many considerations that I’m glossing over, they each need a few words. For instance, another commenter here mentioned the lens flares and (as is likely to be a common reaction) opposed them – so I spend a page or so illustrating what we’re doing there and why (hint: it’s not really “adding lens flares”).

            There’s a lot of code that I hope people pick up and use, both as-is, and re-purposing it entirely.

          • donbox7

            Hi Eddie, is the sample project released already, thanks~

  • Wow, cool!

  • Jim Watters

    Cool. Except for lens flare. Don’t add lens flare. I don’t see lens flares when I stare into the sun, so why should I see them when I am in VR?

    • Eddie Offermann

      The flare can be toned up or down as needed – but the central part of the flare creates what’s called “light wrap” in vfx compositing. And as it turns out – you do see that when you look towards the sun. It’s what keeps you from being able to see small items that pass between you and bright light sources, and what makes the light appear to partially obscure or “wrap” those foreground objects.

      Lens flare can be bad – but light bloom (which Unity’s “bloom” doesn’t really reproduce here) is a real part of how we see light. If you’re not adding something like that to 360 footage with bright light sources, you can’t realistically add cg in front of it (and in my opinion, it looks awful without it – camera sensor response at bright sources is more unrealistic than even the gaudiest of lens flares)

  • Andi Pühringer

    Great demo. Thanks for sharing. I love seeing Unity actively support enhanced 360 video concepts. But I do wonder why the gazebo in front was embedded as a video sphere. After all, it’s a non-moving object, and I am sure isolating this object from the background and making the custom transparency shader work seamlessly must have been a lot of effort. I totally get the benefit of having CG objects move in front of the background 360 video. But why not insert the gazebo as a CG object as well? If you do it smartly you would have the additional huge advantage of being able to walk around in the gazebo, which should add a significant level of additional immersion.

    • Eddie Offermann

      “I am sure isolating this object from the background and making the custom transparency shader work seamlessly must have been a lot of effort.”

      If they’d teleported over to this node, you’d have had a heart attack:

      https://uploads.disquscdn.com/images/3556cfb567049faa91380c4fdee05b357e196695d5c0b4500a16d92f9f50b88b.png

      That’s the Scene view and that shadowy dome is the matte for the nearby and overhead trees.

      See where I’m going with this?

      • Fang

        are you using luma mattes, with some garbage roto? I can see how you could use luma above the horizon, but below it, because of the shadows, I’m guessing you would need static garbage mattes for the tree trunks; with multiple mattes, however, you could get some really nice depth. BTW does the demo release showcase any interactivity? Otherwise I’m wondering why anyone would do this in Unity as opposed to Nuke or similar

        • Eddie Offermann

          Mattes vary in origin for the demo project. Some are luminance, some are roto, most are a combination. Obviously source material will differ for everyone, so they’ll generate mattes in whatever way makes sense to them based on the source footage.

          Using non-prerendered elements is the point – whether that’s to add interactivity, make a passive experience less repetitive, or to insert non-interactive realtime data into a recorded experience.

          • Fang

            Hmmm ‘passive experience less repetitive’, Zabriskie Point VR. I get it :)

        • Eddie Offermann

          Depending on the platform you deploy to, you could add more layers – I haven’t been showing that because it obviously adds effort during creation and steadily increases the performance requirements for deployment. As shown, this will run on recent mobile devices.

  • Lupe Maydana

    Thanks for the video and the post, I just have one question, if I compile to Android the video would work?

    • Eddie Offermann

      Yes – this works on Android. (iOS too) The new VideoPlayer component has a number of options for optimization and Unity does a good job of handling optimal transcoding, but as with anything else, you’re going to need to take charge of it yourself to some degree as well.

      Obviously, newer phones are better than older phones, and anything Daydream-capable is going to behave better than something that’s not Daydream capable. We are, after all, asking the phone to play back 4k video with transparency!

      • Lupe Maydana

        Thanks Eddie, I am really happy, thanks for making this feature possible in native mode for Unity

  • Wes Johansen

    Really exciting to see more video playback functions getting integrated into Unity! I’ve been working with EasyMovieTexture, and while that plugin is great, features such as transparency are lacking.

    I am wondering if the demo is available yet? I’ve searched the asset store and looked over the recent first-party Unity assets but have been unable to find it. Is there a title for the project I can keep an eye out for?

    • donbox7

      @eddieoffermann:disqus hoping Eddie can help with this.

      • Eddie Offermann

        The search term will likely be “Mirada360”

        There will be an announcement shortly – I rebuilt parts of the demonstration project to make it into an easier-to-understand tutorial/base asset (what we demonstrated involved a large amount of code that was specific to how we’re working at Mirada, including code that depends on a support infrastructure and a way of representing scene descriptions that’s beyond what we’d want to release) – so that added a little bit of time before release. Simultaneously, Unity was working on agreements to include a couple of the assets that were used in the demo (assets we paid for but for which we didn’t have license to redistribute outside of a build). Unity was able to secure an agreement for one, but couldn’t arrive at an agreement for the other, which necessitated additional changes.

        Because of the type of cooperative effort this is, and how Unity would like to promote this, the project is currently undergoing internal review to ensure that everything we share represents Best Practices. Since this is coming with an official recommendation from Unity, everything needs to be “just right”.

        Since this is also a really heavy “conference season”, it’s a little harder to coordinate these things with everybody that needs to be involved. Since the unveiling at Vision Summit LA, there have been three Unite Asia conferences (Tokyo, Shanghai and Seoul), Microsoft Build, LiveWorx, AWE is coming up, etc.

        We’re working hard to get this asset released soon – I’ve done some sort of tweak or sent the project to someone else for review every week since Unite. There’s a 20-page tutorial covering the details – I *hate* that I’m still sitting on it, but we’re really wanting to do everything we can to make this the right example of how to work, with full support from everyone all around.

        Just a little bit longer – I’m hoping we get the thumbs-up next week.

        • Wes Johansen

          Thanks so much for your reply! Very excited!

        • dbroggy

          it’s been 3 months since your post and I’m not seeing “Mirada360” in the asset store or anywhere. Any updates?

  • J Eleora

    How do you add the animated 3D objects in between the two spheres?