Mimesys Leverages ARKit as a Mobile Viewer into Collaborative Virtual Environments


VR remote collaboration company Mimesys has dreamt up an experiment that envisions mobile AR as a portal into remote virtual environments, allowing for quick spatial collaboration on the go, without putting on a VR headset.

VR collaboration and visualization can be massively powerful, but it's generally only accessible if you're near a VR-ready computer and headset. But what about when you need to work collaboratively with 3D visualization on the go? Mimesys has a smart answer that fuses AR and VR.


If you're away from your desk, your home, or the office, you might not want to strap a VR headset to your head just to visualize spatial data while working on the go. Mimesys has employed Apple's ARKit to turn an iPhone or iPad into a mobile viewer that acts as a window into the company's virtual collaborative environments.

In the video above, the company shows how a smartphone can act as an accessible way to peer inside a virtual environment inhabited by another user, and to view detailed 3D data.

In a blog post explaining the experiment, the company says that using your phone or tablet in this way offers a unique experience compared to video chatting and screen sharing:

Even if the point of view of the iOS device running ARKit is limited, the ability to visualize the other person gives a feeling of presence and a different impression than a regular Skype session since the person is firmly positioned in the world.

One of the big advantages of a mobile holographic session is to be able to visualize and comment on a lot of visual content spatially, something that is difficult to do remotely, and even harder in mobility. ARKit also offers the ability to visualize 3D objects naturally from every angle, which makes the collaboration around them very natural. As with our own holographic meetings, we expect that use cases around 3D files will be the most popular to start.

The AR device isn't just a static view into the world; it also acts as input. Building upon ARKit's tracking, the video shows how the user can navigate the virtual environment—which can include other VR or AR users—and explore the data inside by moving their phone around. While the VR user is represented as an avatar (in this case with real-life projected imagery), the AR user is represented as a floating tablet inside the virtual space. The company says it hopes to expand the AR user's capabilities beyond just viewing, but that will require further experimentation.
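Mimesys hasn't published implementation details, but the underlying mechanism is straightforward: ARKit continuously tracks the device's pose, and that pose can be forwarded to the shared environment each frame so remote users see the AR participant's viewpoint. The Swift sketch below illustrates the idea under those assumptions; the `PoseStreamer` class and `sendPose(_:)` placeholder are illustrative, not Mimesys' actual code.

```swift
import ARKit
import simd

// Hypothetical sketch: stream the AR device's tracked pose into a shared scene.
final class PoseStreamer: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // World tracking gives a 6-DoF pose for the device.
        let configuration = ARWorldTrackingConfiguration()
        session.delegate = self
        session.run(configuration)
    }

    // ARKit calls this every frame with the device's current camera pose.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // 4x4 transform of the device in ARKit's world-tracking coordinate space.
        let devicePose: simd_float4x4 = frame.camera.transform
        sendPose(devicePose)
    }

    private func sendPose(_ transform: simd_float4x4) {
        // Placeholder: serialize the transform and send it to the collaboration
        // service so other participants see the AR user as a floating tablet
        // positioned and oriented at this pose.
    }
}
```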

Image courtesy Mimesys

Mimesys’ experiment seems like a darn smart way to see into collaborative virtual environments in cases where 3D visualization is critical but immersion is not.



Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • Xron

An interesting idea, hope it works; right now there are lots of artifacts and lots of lag.

  • Ian Shook

They’ve been using Kinects in the past to do the people-scanning part. Is that what they used for this demo?

    • Lucidfeuer

      Given the artefacts, I think they’re just using a basic camera set-up, which is great because it opens the door for simple smartphone, cam or drone live retranscription (plus added IR to remove background and get a finer capture).

    • Mimesys

      Yep, we used a Kinect + our volumetric streaming format, so we can have a full volumetric protagonist and not a flat one.

  • Interesting experiment, but it is obvious that this kind of collaboration is optimal in VR

  • Matias Nassi

    Seems pretty good! Btw the logo of this company (check start of the video) is just amazing…