A project from researchers at Reality Labs, Meta’s XR division, demonstrates a method for interacting with the real world through augmented reality.

I’m about as seasoned as they get when it comes to the XR space. I see research papers all the time that are interesting but ultimately one more step along a predictable path. It’s rare that I come upon something that I’d never even conceptually considered. But this new Reality Labs research project is exactly that.

In fact, it’s so different that it’s kind of hard to explain.

Researchers at Meta Reality Labs Research and the University of Duisburg-Essen devised a method for making real objects virtual so they can be interacted with in real time. By not only digitizing the objects but also seamlessly erasing their real counterparts, the method creates a mind-bending blend of the real and virtual worlds. The video below explains it better than I possibly could with text alone:

The researchers—Mohamed Kari, Reinhard Schütte, and Raj Sodhi—call this concept ‘Scene Responsiveness’ and published their work late last year at the ACM Symposium on User Interface Software and Technology.

As the authors put it, this method creates a “visual illusion that virtual actions affect the physical scene.”

What makes this so interesting? We’ve all seen AR demos where virtual objects seem to float in your room, attach to your walls, or even walk ‘behind’ objects. And we’ve seen virtual objects that seem to abide by the physics of the room by rolling across a table and then correctly falling off when they reach the edge.

But digitizing the real objects in the space to give the user virtual control over them—while erasing the real object from where it once sat—that’s a damn interesting trick. Perhaps even more powerful is when virtual characters in the scene appear to interact directly with objects in the real world.

I quite appreciate that the authors even took the next step and showed how to prevent the illusion from breaking.

Giving the user control over an (apparently ‘real’) virtualized object, but not allowing them to get close enough to actually touch it, means the expectation of tactile feedback can’t be broken. Another smart detail prevents users from bumping into objects that have been erased from the scene: the real object reappears if the user gets too close.
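That proximity guardrail is easy to picture in code. Here’s a minimal sketch of the core check — not from the paper; the names, types, and the one-meter threshold are purely illustrative — in which an erased real object re-materializes the moment the user closes within a per-object safety radius:

```python
from dataclasses import dataclass
import math

@dataclass
class ErasedObject:
    name: str
    position: tuple[float, float, float]  # world-space position in meters
    safety_radius: float  # re-show the real object inside this distance

def objects_to_reveal(user_pos, erased_objects):
    """Return erased objects whose illusion should be dropped because
    the user is close enough to physically collide with them."""
    return [o for o in erased_objects
            if math.dist(user_pos, o.position) <= o.safety_radius]

# A chair erased from the scene reappears once the user is within
# one meter of it; a more distant lamp stays hidden.
scene = [
    ErasedObject("chair", (2.0, 0.0, 0.0), 1.0),
    ErasedObject("lamp", (5.0, 0.0, 1.0), 0.5),
]
revealed = objects_to_reveal((1.2, 0.0, 0.0), scene)
print([o.name for o in revealed])  # → ['chair']
```

In a real headset pipeline this check would run every frame against the tracked head (and ideally hand) positions, but the principle is just this distance test.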


What’s been shown in the work is visually imperfect and largely a proof-of-concept that relies on lots of manual, scene-specific work to make possible. But with the rapid advancements of computer vision and artificial intelligence, it’s not difficult to imagine how such a system could be built in an automatic and scene-independent way, and could be very convincing.

The authors have some surprisingly creative ideas for practical applications of this type of AR illusion, including “for health, we imagine as the user reaches out to a physical unhealthy chocolate bar, it morphs into something less appealing such as a spider. Or, the chocolate bar grows legs, runs away, morphs into a banana while running, and rephysicalizes at the location of a physical banana.”

So if, in the future, we find ourselves chasing down a magic chocolate bar only to find that it turned into a banana by the time we reach it—we’ll know just who to blame.




Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • MackRogers

    Meta labs is the king of PROOF OF CONCEPT.

    I wonder if the meta labs team goes out with Abrash on the weekends and they all laugh at how great life must be to tinker around with stuff all day knowing none of it will ever see the light of day.

    Living off Zuck’s teat.

    No wonder Carmack left, what a complete clown show it is over there.

    • Ninjatogo

      What do you mean none of it will see the light of day? They developed the hand tracking solution that exists in the Quest today. They’ve developed tons of open source software (LLaMa 2) that you can download as well. Their developments may not always make it to a product or reach the open source community, but they are making useful tools for future developers.

      • MackRogers

        I am talking about the Abrash-led labs in the context of VR. Like when they did that dog and pony show a year ago and showed high-nit experimental headsets.

        Nothing ever trickled down, EVER. Quest Pro was supposed to be an incredible unit with years of Abrash’s teams R&D and the numerous outright acquisitions they made.

        The thing launched with near-zero software support and slightly upgraded sweet-spot optics. I doubt they even solved those optics themselves and didn’t just buy a team that solved it for them.

        Abrash is one of the worst human beings alive.

        • ViRGiN

          Apple denied your credit card or what?

        • Ninjatogo

          Abrash’s team is R&D, they build cool things and tell the business how much it costs. They don’t build final products or determine what makes it into one.

        • FN

          Did he pee in your cereal one time?

    • Anonymous

      So Valve is the King of Nothing to Release, then?
      What a fucking stupid clown.

    • Blaexe

      A couple of things from Reality Labs have shipped: hand tracking, inside-out body tracking, AI legs, sound features, the pancake lenses – and probably a lot more.

      These “major breakthroughs” just need a longer time. Abrash once said that the things they develop end up in products around 5 years later at the earliest. Some things (if they ship) will take 10 years or more.

    • ViRGiN

      Yup, credit card denied – confirmed.

    • JACrazy

      Microsoft Labs is up there in terms of cool proof of concept prototypes that never go anywhere.

  • Xron

    Well, seems decent, as long as we can get upgraded graphics for it, collision, etc.

  • Christian Schildwaechter

    03:12 When deploying to the headset, our spatial computing and stereoscopic shading architecture automatically renders it for scene responsive mixed reality […]

    Neither the Apple nor the Meta marketing department can approve this scientific message.

  • Octogod

    Astonishing.

  • Traph

    That chocolate bar example is heavily reminiscent of those mid-2010s viral CGI videos imagining an AR future dystopia.

  • fcpw

    When people start hurting themselves sitting on phantom chairs, the lawsuits are gonna fly.

  • Ardra Diva

    After playing with the MR features in my Quest 3, I feel like this is gonna be huge in so many different ways. Imagine watching a new Star Wars movie where the stormtroopers walk off the screen and into your room. Totally possible now.