VR remote collaboration company Mimesys has dreamt up an experiment that envisions mobile AR as a portal into remote virtual environments, allowing for quick spatial collaboration on the go, without putting on a VR headset.
VR collaboration and visualization can be massively powerful, but it’s generally only accessible if you’re near a VR-ready computer and headset. But what about when you need to work collaboratively with 3D visualization on the go? Mimesys has a smart answer that fuses AR and VR.
If you’re not at your desk, at home, or at the office, you might feel uncomfortable sticking a VR headset on your head to visualize spatial data while working on the go. Mimesys has employed Apple’s ARKit to turn an iPhone or iPad into a mobile viewer that acts as a window into the company’s virtual collaborative environments.
In the video above the company shows how a smartphone can act as an accessible way to peer inside of a virtual environment inhabited by another user, and to view detailed 3D data.
In a blog post explaining the experiment, the company says that using your phone or tablet in this way offers a unique experience compared to video chatting and screen sharing:
Even if the point of view of the iOS device running ARKit is limited, the ability to visualize the other person gives a feeling of presence and a different impression than a regular Skype session, since the person is firmly positioned in the world.
One of the big advantages of a mobile holographic session is being able to visualize and comment on a lot of visual content spatially, something that is difficult to do remotely, and even harder while mobile. ARKit also offers the ability to visualize 3D objects naturally from every angle, which makes collaboration around them very natural. As with our own holographic meetings, we expect that use cases around 3D files will be the most popular to start.
The AR device is not just a static view into the world; it’s also an input device. Building on ARKit’s tracking, the video shows how the user can navigate the virtual environment (which can include other VR or AR users) and explore the data inside by moving their phone around. While the VR user is represented as an avatar (in this case with real-life projected imagery), the AR user is represented as a floating tablet inside the virtual space. The company says it hopes to expand the AR user’s capabilities beyond just viewing, but that will require further experimentation.
Mimesys’ experiment seems like a darn smart way to see into collaborative virtual environments in cases where 3D visualization is critical but immersion is not.