
Image courtesy Sony

Sony Experimenting with Advanced Livestreaming Features for PSVR, Mixed Reality Views


Speaking during a session at GDC 2018 last month, Richard Forster, Senior Team Lead at Sony’s Research & Development West group, chronicled a range of explorations in advanced livestreaming and sharing features for PSVR.

Today PSVR users can access the usual PS4 broadcasting tools. Using the SHARE button, you can capture screenshots, videos, and livestream gameplay showing your view from the ‘Social Screen’, the cropped first-person view shown on the TV as you play. But PC VR users have had a leg up thanks to access to tools which allow for things like a picture-in-picture view of the player, and even mixed reality views where games allow players to composite themselves into the action.

Forster said that the livestreaming experiments he was going to share were not yet part of the company’s SDK roadmap for PSVR, though the session made it clear that Sony is interested in finding new and improved ways for users to share their PSVR gaming sessions with audiences.

Forster identified several different use cases that the experimental tools are aimed at.

The first is what he called “easy access broadcasting,” the kind of low-production livestreaming that a single user could manage on their own. The approach is similar to what PSVR users can do today with the SHARE function, except this mode could allow the player to position a virtual camera within their game world, rather than only broadcasting the first-person Social Screen view. Forster suggested that the first-person view could also be modified for a better sharing experience, including the ability to re-render the view for proper fullscreen output (rather than relying on a cropped and distorted version of the view rendered for the headset). A steadicam mode, which would smooth out shaky head movements, could also be applied, in addition to post effects to change the look and feel of the output.
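
Forster didn’t detail how the steadicam mode would work under the hood, but conceptually it amounts to low-pass filtering the headset pose before feeding it to the broadcast camera. Below is a minimal sketch of one common way to do that with frame-rate-independent exponential smoothing; the structure, names, and time constants are illustrative assumptions, not Sony’s code.

```cpp
// Minimal sketch of a 'steadicam' smoothing filter for a broadcast camera,
// assuming we only receive the headset's position and yaw each frame.
// Illustrative only; this is not Sony's implementation.
#include <cmath>
#include <cstdio>

struct Pose {
    float x, y, z;   // headset position in meters
    float yawDeg;    // heading in degrees
};

class SteadicamFilter {
public:
    explicit SteadicamFilter(float smoothing) : smoothing_(smoothing) {}

    // Exponentially smooth the incoming head pose so the broadcast
    // camera ignores small, rapid head movements.
    Pose Update(const Pose& head, float dt) {
        if (!initialized_) {
            current_ = head;
            initialized_ = true;
            return current_;
        }
        // Frame-rate-independent smoothing factor.
        float alpha = 1.0f - std::exp(-dt / smoothing_);
        current_.x += (head.x - current_.x) * alpha;
        current_.y += (head.y - current_.y) * alpha;
        current_.z += (head.z - current_.z) * alpha;
        current_.yawDeg += ShortestAngle(head.yawDeg, current_.yawDeg) * alpha;
        return current_;
    }

private:
    // Wrap the angular difference into [-180, 180] so smoothing
    // takes the short way around.
    static float ShortestAngle(float target, float current) {
        return std::fmod(target - current + 540.0f, 360.0f) - 180.0f;
    }

    float smoothing_;       // time constant in seconds
    Pose current_{};
    bool initialized_ = false;
};

int main() {
    SteadicamFilter filter(0.25f);  // ~250 ms time constant
    // Feed in a jittery head pose and print the smoothed broadcast pose.
    for (int frame = 0; frame < 5; ++frame) {
        Pose head{0.02f * (frame % 2), 1.6f, 0.0f, 10.0f * frame};
        Pose cam = filter.Update(head, 1.0f / 60.0f);
        std::printf("frame %d: cam yaw %.2f deg\n", frame, cam.yawDeg);
    }
    return 0;
}
```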

Another modality, which involves a bit more production, would be to have a ‘producer on the couch’, a second user who would act as the livestream director. The second user could sit on the couch next to the VR player and use a controller to switch camera views and make other adjustments in real time, potentially also offering on-the-spot commentary.

Forster explained that the R&D West team had built an internal Air Hockey VR demo to play with some of these ideas and see what else could be done to enhance the streaming experience.

As they began experimenting with third-person camera views, it became clear that tweaking the way the player is rendered to the third-person camera would make for a better livestreaming experience. For instance, while the game alone wouldn’t have required it, Forster suggested that adding some kinematics to the avatar to make it move more realistically would make the third-person perspective look better to viewers. He also noted that fake eye movements, like blinking and object tracking, could make avatars look more believable and interesting to viewers.
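
Procedural blinking and object tracking of the kind Forster described are typically built from simple timers and easing rather than anything elaborate. The sketch below shows one way such a system might look; the class, timings, and update loop are assumptions for illustration, not details taken from Sony’s demo.

```cpp
// Minimal sketch of procedural blinking and gaze tracking for a broadcast
// avatar. Names and timings are illustrative assumptions, not Sony's code.
#include <cstdio>
#include <random>

struct Vec3 { float x, y, z; };

class AvatarEyes {
public:
    // Called once per frame: runs a randomized blink cycle and eases the
    // gaze point toward whatever object the avatar should appear to watch.
    void Update(float dt, const Vec3& target) {
        // Procedural blink: eyes are fully open (1.0) most of the time,
        // with a quick close/open cycle every few seconds.
        blinkTimer_ -= dt;
        if (blinkTimer_ <= 0.0f) {
            blinking_ = true;
            blinkPhase_ = 0.0f;
            blinkTimer_ = NextBlinkInterval();
        }
        if (blinking_) {
            blinkPhase_ += dt / 0.15f;           // ~150 ms per blink
            eyeOpenness_ = blinkPhase_ < 0.5f
                ? 1.0f - 2.0f * blinkPhase_      // closing
                : 2.0f * (blinkPhase_ - 0.5f);   // opening
            if (blinkPhase_ >= 1.0f) { blinking_ = false; eyeOpenness_ = 1.0f; }
        }
        // Fake object tracking: ease the gaze point toward the target so the
        // eyes appear to follow, e.g., the puck on the air hockey table.
        const float follow = 8.0f * dt;
        gaze_.x += (target.x - gaze_.x) * follow;
        gaze_.y += (target.y - gaze_.y) * follow;
        gaze_.z += (target.z - gaze_.z) * follow;
    }

    float EyeOpenness() const { return eyeOpenness_; }
    Vec3 GazePoint() const { return gaze_; }

private:
    float NextBlinkInterval() {
        std::uniform_real_distribution<float> dist(2.0f, 6.0f);
        return dist(rng_);
    }

    std::mt19937 rng_{42};
    float blinkTimer_ = 3.0f;
    float blinkPhase_ = 0.0f;
    float eyeOpenness_ = 1.0f;
    bool blinking_ = false;
    Vec3 gaze_{0.0f, 0.0f, 1.0f};
};

int main() {
    AvatarEyes eyes;
    Vec3 puck{0.5f, 0.9f, 1.2f};  // hypothetical object the avatar watches
    for (int frame = 0; frame < 300; ++frame) {
        eyes.Update(1.0f / 60.0f, puck);
    }
    std::printf("eye openness after 5s: %.2f\n", eyes.EyeOpenness());
    return 0;
}
```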

Image courtesy Sony

The team also experimented with enabling some of the cool composited mixed reality views that are popular for high-production broadcasts of PC VR content. Beyond just compositing a subject into the game using a green screen, Forster said it would also be possible for the game to output a mask which could be used to make players appropriately appear behind content in the game world, rather than just being plastered on top of it (as seen in the image above, where the player is ‘on top’ of the air hockey table instead of behind it).
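
Forster didn’t walk through the compositing pipeline itself, but the general idea is to layer three aligned images: the game’s background render, the green-screen-keyed camera view of the player, and game-rendered foreground geometry selected by the mask. The sketch below illustrates that per-pixel layering; the buffer layout and function are assumptions for illustration, not an actual PSVR or engine API.

```cpp
// Minimal sketch of mask-based mixed reality compositing, assuming three
// aligned image buffers of equal size. Illustrative only; not Sony's code.
#include <cstdint>
#include <vector>

struct RGBA { uint8_t r, g, b, a; };

// Composite one frame. Where the mask marks game geometry as being in front
// of the player (e.g. the near side of the air hockey table), the game's
// foreground pixels are drawn over the camera image so the player correctly
// appears *behind* that geometry instead of pasted on top of everything.
std::vector<RGBA> CompositeFrame(const std::vector<RGBA>& gameBackground,
                                 const std::vector<RGBA>& playerCamera,   // alpha = green-screen key
                                 const std::vector<RGBA>& gameForeground,
                                 const std::vector<uint8_t>& foregroundMask) {
    std::vector<RGBA> out(gameBackground.size());
    for (size_t i = 0; i < out.size(); ++i) {
        // 1. Start with the game's background layer.
        RGBA pixel = gameBackground[i];
        // 2. Layer the keyed player footage on top of the background.
        if (playerCamera[i].a > 0) {
            pixel = playerCamera[i];
        }
        // 3. Re-draw game geometry flagged as 'in front of the player'.
        if (foregroundMask[i] > 0) {
            pixel = gameForeground[i];
        }
        out[i] = pixel;
    }
    return out;
}

int main() {
    // Tiny 2x1 example: the player pixel at index 1 is hidden by foreground geometry.
    std::vector<RGBA> bg   = {{10, 10, 10, 255}, {10, 10, 10, 255}};
    std::vector<RGBA> cam  = {{0, 0, 0, 0}, {200, 150, 120, 255}};
    std::vector<RGBA> fg   = {{0, 0, 0, 0}, {255, 255, 255, 255}};
    std::vector<uint8_t> mask = {0, 255};
    auto frame = CompositeFrame(bg, cam, fg, mask);
    return frame.size() == 2 ? 0 : 1;
}
```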

Of course, compositing and masking take additional resources and could impact the performance of the VR game. Forster suggested that a second PS4 Pro could be employed, dedicated to generating the output for real-time broadcasting and compositing, which could potentially include 4K and HDR output, additional shader effects, and more.

Image courtesy Sony

In addition to sharing gameplay, Forster said that the team was motivated to explore these advanced sharing functions to help developers promote their games, including the ability to output production-ready livestreams for professional streamers and to help users capture cool moments in VR games to share online. Trade shows too, he said, would benefit from these advanced functions, giving Sony and others better ways to show off what players are seeing inside the headset while other players wait in line to try it for themselves. He also said that the tools could be helpful for the production of VR trailers, giving developers better ways to show potential players what it’s like to play their PSVR titles.

Sony was showing off a number of these experiments on the show floor at GDC 2018 using various internal demos, but hasn’t committed to which, if any, might reach the public.