The FIFA World Cup final is nearly here, and while most everyone interested in the France vs. Croatia match will still be glued to a TV set, researchers from the University of Washington, Facebook, and Google just gave us a prescient look at what an AR soccer match could look like in the near future.

The researchers have devised an end-to-end system to create a moving 3D reconstruction of a real soccer match, which they say in their paper can be viewed with a 3D viewer or an AR device such as a HoloLens. They did it by training a convolutional neural network (CNN) on hours of virtual player data captured from EA’s FIFA video games, which gave the team the ground truth needed to ingest a single monocular YouTube video and output a sort of 2D/3D hybrid reconstruction.
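The appeal of training on a video game is that the engine can hand you perfect per-pixel depth for free alongside each rendered frame. As a rough, hypothetical sketch of what one such training pair might look like (the crop size, shapes, and function names here are illustrative assumptions, not the authors’ code):

```python
import numpy as np

CROP = 64  # assumed side length of a square player crop

def extract_training_pair(frame, depth_buffer, bbox):
    """Cut a player crop and its matching depth map out of a rendered frame.

    frame:        (H, W, 3) uint8 RGB image from the game engine
    depth_buffer: (H, W) float32 per-pixel depth from the same render
    bbox:         (x, y) top-left corner of the player's bounding box
    """
    x, y = bbox
    rgb = frame[y:y + CROP, x:x + CROP]
    depth = depth_buffer[y:y + CROP, x:x + CROP]
    return rgb, depth

# Fake a rendered frame and depth buffer just to show the data layout;
# in the real pipeline these would come from the game engine's renderer.
rng = np.random.default_rng(0)
frame = rng.integers(0, 255, size=(720, 1280, 3), dtype=np.uint8)
depth = rng.random((720, 1280)).astype(np.float32)

rgb_crop, depth_crop = extract_training_pair(frame, depth, bbox=(100, 200))
print(rgb_crop.shape, depth_crop.shape)  # (64, 64, 3) (64, 64)
```

A network trained on thousands of such (image crop, depth crop) pairs can then estimate a depth map for each player detected in real broadcast footage, where no depth sensor exists.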

Researchers involved in the project are University of Washington’s Konstantinos Rematas, Ira Kemelmacher-Shlizerman (also Facebook), Brian Curless, and Steve Seitz (also Google).

There are a few caveats that should temper your expectations of seeing a ‘perfect’ 3D reconstruction you could watch from any angle: players are still projected as flat 2D textures, individual player positioning is still a bit jittery, and the ball isn’t tracked yet (an indispensable part of the equation that’s coming in the future, the team says). And because the system works from a single monocular shot, occlusion is an issue: when players are hidden from the camera, their textures disappear from view.

The prospect of watching a (nearly) live soccer match in AR is still pretty astounding though, especially on your living room coffee table.

Image courtesy University of Washington, Facebook, Google

“There are numerous challenges in monocular reconstruction of a soccer game. We must estimate the camera pose relative to the field, detect and track each of the players, re-construct their body shapes and poses, and render the combined reconstruction,” the team writes.
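The first of those steps, estimating the camera pose relative to the field, is tractable because the pitch is a flat plane with line markings at known positions, so it reduces to fitting a planar homography. Below is a minimal numpy sketch of that idea using four made-up pixel/field correspondences; the paper’s actual approach is more robust than hand-picked points, so treat this as an illustration of the geometry only:

```python
import numpy as np

def homography_from_points(img_pts, field_pts):
    """Direct linear transform: homography mapping image pixels -> field plane."""
    A = []
    for (u, v), (x, y) in zip(img_pts, field_pts):
        A.append([-u, -v, -1, 0, 0, 0, u * x, v * x, x])
        A.append([0, 0, 0, -u, -v, -1, u * y, v * y, y])
    # The null vector of A (last right-singular vector) holds the 9 entries of H.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def image_to_field(H, u, v):
    """Project a pixel onto the field plane (homogeneous divide)."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Hypothetical correspondences: pixel positions of the four pitch corners
# in one broadcast frame, and their known positions on a 105 x 68 m field.
img_pts   = [(120, 80), (1160, 90), (1100, 620), (160, 600)]      # pixels
field_pts = [(0.0, 0.0), (105.0, 0.0), (105.0, 68.0), (0.0, 68.0)]  # meters

H = homography_from_points(img_pts, field_pts)
cx, cy = image_to_field(H, 640, 360)  # e.g. a tracked player's foot pixel
print(round(cx, 2), round(cy, 2))     # that player's field position in meters
```

Once the homography is known, each detected player’s position on the pitch follows from their foot pixel, and the depth-map ‘billboard’ for that player can be planted at the right spot in the reconstruction.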

Viewing live matches won’t be possible for a while either, the team says. To watch a full match on an AR device such as a HoloLens, the system still needs a real-time reconstruction method, plus efficient data compression and streaming to deliver the result to your AR headset.


Because the system relies on standard footage, it represents a sort of low-hanging fruit of what’s possible with current capture tech. Even working from 4K video, it has to contend with unwanted distortions such as chromatic aberration, motion blur, and compression artifacts.

Ideally, a stadium would be outfitted with multiple cameras dedicated to AR capture for the best possible outcome. That isn’t the goal of the paper, but it’s a definite building block on the way to live 3D sports in AR.

The team’s research will be presented at the Computer Vision and Pattern Recognition conference, which is taking place June 18-22 in Salt Lake City, Utah.



  • JoelGriffinDodd

    Seriously impressive work. I love the fact that we could add virtual 3rd-person views to sports alongside the already common idea of virtual 1st-person viewing from the stadium seats or sidelines. Field-based sports really seem to have the ‘killer app’ potential to bring AR and VR into the living room for Joe Public.

  • Jerald Doerr

    Yeah, at first I was like… uh, ok… way too much work for such a small viewing angle… but wow, good job. If stadiums set up special cameras (around 30 of them) with higher-speed shutters to remove the motion blur, plus used their technique in real time, you’d have a moving Matrix shot with frames filling in the gaps in VR…

  • JJ

    Ever since people talked about watching sports in VR, this is all I can think of. I’m super happy it’s starting to happen.

  • Shawn Flanagan

    Reminds me of the Red Bull Air Race Flight Deck. It’s a HoloLens experience with the same concept: watch a live event in AR in front of you.

    There’s a YouTube video from Unity’s Vision Summit 2017 where Rewind (the agency behind its development) goes into the details of creating it. Just search for “Vision 2017 – Red Bull Air Race MR: How to Deploy Unity VR skills for HoloLens Development” on YouTube to find it.


  • Sandy Wich

    You know, that’s actually a really cool idea. Imagine someday a real-time AR scan of a live event? Basically watching a live sports game on the floor/table of your house?

    Idk what kind of tech would be needed to make that happen, but I’d be game.

  • brian solomon

    Interesting that they trained the models on video game footage and not actual footage. If they scanned players prior to games (or used the games’ models of the players) and processed each game with something like OpenPose, I could see this becoming relatively commonplace. It still doesn’t really capture the nuances of the actual footage, but it’s cool. We have to start somewhere!