Amidst other VR-related announcements from Google at today’s I/O 2015 conference comes ‘Jump’, a VR camera that will follow Cardboard’s path by offering an open design. Google is also revealing the ‘Assembler’, which the company says can combine multiple video feeds into a “seamless” 3D scene. YouTube will support VR playback of Jump videos starting this summer.

‘Jump’ is what Google is calling its foray into the world of VR video. The company has created a VR video camera design consisting of 16 viewpoints; it will release the design to the public and encourage third parties to build ‘Jump’ cameras that capture 3D, 360-degree footage for virtual reality playback. The approach is similar to Google’s Cardboard VR viewer, which the company opted not to sell, instead releasing the plans and allowing third parties to manufacture and sell the device.

See Also: Google Announces New, Larger Cardboard VR Viewer with Universal Input Button

Google, like others, says it has experimented extensively to determine the ideal placement of the cameras for recording VR video. The 16-camera rig forms a fairly large ring, with none of the cameras appearing to cover directly above or below. Any top or bottom gaps may be filled in computationally: directly above the camera there will often be nothing but sky or ceiling, while directly below will often be the ground or floor. We’ve seen this approach used elsewhere, especially to eliminate a tripod that might be holding the rig from the bottom.



One problem with existing VR cameras is ‘stitching’ errors: visible seams where the various video streams are merged together into a single 360-degree view. Google says it has a solution for this.

The ‘Assembler’ is Google’s stitching service, which the company says uses computer vision and “3D alignment” to create a scene with no stitching seams (a claim we’ll have to see to believe). Google says this is achieved by analyzing the scene in 3D and adapting the stitching to match. The system outputs a scene that’s “depth corrected stereo in all directions,” at a high resolution the company describes as “the equivalent of five 4k TVs playing at once.”
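Google hasn’t detailed how the Assembler works internally, but the basic alignment-then-blend idea behind any seam-free stitcher can be illustrated with a toy one-dimensional sketch. Everything below is illustrative only (not Google’s actual pipeline): two neighboring cameras see the same strip of the scene at a horizontal offset, and the stitcher must recover that offset before blending, or a visible seam appears.

```python
import numpy as np

# Toy 1-D analogue of view alignment: two adjacent cameras capture the
# same row of pixels, offset horizontally by an unknown amount.
rng = np.random.default_rng(0)
scene = rng.standard_normal(200)         # stand-in for one row of pixels
true_shift = 7
right_view = np.roll(scene, true_shift)  # neighboring camera's view of the row

def estimate_shift(a, b, max_shift=20):
    """Return the offset of b relative to a that maximizes correlation."""
    return max(range(-max_shift, max_shift + 1),
               key=lambda s: np.dot(a, np.roll(b, -s)))

shift = estimate_shift(scene, right_view)  # recovers the true offset
aligned = np.roll(right_view, -shift)      # undo the offset
blended = 0.5 * (scene + aligned)          # blend with no visible seam
```

A real stitcher does this per-pixel in 2D (e.g. with optical flow) and must also account for depth, since near and far objects shift by different amounts between viewpoints; that is presumably what Google means by “3D alignment.”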


The Assembler service will be made available to “creators” this summer, though it isn’t clear whether the company plans to charge for it, a possibility given the computing power involved in the process. It also isn’t clear whether the Assembler will stitch only content recorded from cameras adhering to the Jump design, or whether it’s a universal stitching method for any multi-viewpoint VR camera. The former may be the case, as the company notes that “The size of the [Jump] rig and the arrangement of the cameras are optimized to work with the Jump assembler.”


In addition to the Assembler, Google is bringing Jump VR video to YouTube. The company recently added 360 degree video support to the streaming video platform, and this summer they say that YouTube will allow users to “experience immersive video from your smartphone,” presumably through Google’s Cardboard VR viewer. Where that leaves access by other platforms like the Oculus Rift, Gear VR, and HTC Vive is still unclear.


See Also: YouTube Now Supports 360 Video, No VR Support (Yet)

Samsung ‘Project Beyond’ camera uses 16 cameras arranged in stereo pairs.

Samsung is also working on a 3D VR camera, which it calls Project Beyond. It uses a different lens layout (including a camera that covers the top of the scene), and we’re sure both companies will battle over whose layout is best.

See Also: First Impressions of Project Beyond, Samsung’s 360 3D Camera for VR

Creators interested in working with Google’s Jump VR camera can hop in line using this form.


This article may contain affiliate links. If you click an affiliate link and buy a product we may receive a small commission which helps support the publication. More information.


Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • AJ@VRSFX

    The JUMP concept is same-old-same-old, but the assembler looks incredible. Using computer vision to understand occlusion of near vs. far objects… I’ll be blown away if it works. The dream of non-analog stereoscopy feels too good to be true. Seems like they’re saying they can do it without IR or any depth-sensing technology. They said the processing requires “thousands of computers.” I assume that means we upload our footage to the cloud where the assembler will chew on it for a while.

    Here’s what I’m dying to know: will the assembler spit the output back out to us so that we can distribute it anywhere we want, or will we be locked into distributing it only on Youtube from there? I’d prefer to retain control over distribution (not that I wouldn’t distribute on Youtube, but I don’t like exclusivity).

    • kalqlate

      Most likely, it will be a streaming service that YouTube or any presenter/distributor/service can subscribe to.

    • lasandcris

I saw a video on YouTube the other day (https://youtu.be/f4ThoVg2obI) claiming you can shoot 3D 360 with any consumer camera; you just upload the images to their server, which sends the VR images back. I thought it was cool.

      • kalqlate

        Did you read the article on this page? The JUMP stitching process is different than normal 360 stitching.

        • lasandcris

          Yes, I read the page. This is exactly why I posted that link, because these guys have a very similar process to the JUMP (and I’m not talking about simple stitching here – they also use some sort of an optical flow calculation between viewing angles to perspective warp misaligning edges). But what is even more outstanding to me is that their process is automated.

          • kalqlate

            Sounds interesting. I tried to find more information on this particular technology, but wasn’t successful. If you have a link to good info, please share.

  • Simon

I’m wondering what the orange blocks on the rear of the camera are for. Some kind of backpack?
It’s interesting that the cameras are oriented that way; whether by chance or by design, their IO connectors are facing down and could be connected to the rig, which could be a way to provide power and/or control.

  • Stray Toaster

    This is a good next step for recording VR video. I think true VR video should be shot on light-field cameras so that each eye sees a slightly different POV and the depth of field changes depending on where the user looks, via eye-tracking or a similar solution.