Watch: Latest Unity VR Editor Shown Off at Unite, Releasing in 2016


During the opening keynote of their annual Unite conference yesterday, Unity announced that their in-development VR authoring tool EditorVR, which allows creators to step inside and work on their projects using virtual reality, is on its way within the next couple of months. Here’s a video of the full demonstration showing off the latest version.

At this year’s annual Unite conference, Unity were once again out to make it clear that they and their hugely popular 3D game engine were ‘all about VR’. To that end, they used a section of the conference’s opening keynote to give us the latest look at the engine’s virtual reality enabled editor, which allows creators to dive into their projects and build their games and apps immersively from within VR.

SEE ALSO
Unity's VR Editor Lets You Create VR Content Like a God

The VR authoring tool, which the company refers to as ‘EditorVR’ (or EVR for short), was originally shown at last year’s Unite conference using the Oculus Rift headset and Oculus Touch motion controllers, and also made a guest appearance at GDC earlier this year. At Unite 2016, Principal Designer Timoni West once again demonstrated live on stage (accompanied by Principal Engineer Amir Ebrahimi) Unity’s progress on EVR, this time using the HTC Vive.


West used a Unity project and assets from Campo Santo’s hit indie game Firewatch (no, there’s no VR version to announce as yet, I’m afraid) to demonstrate how any project could be worked on from within VR. In fact, West made the point that even non-VR projects could benefit from the immersive editing experience, as it gives editors a feel for the scale of the worlds they’re building in real time.

SEE ALSO
Unity Raises $181 Million Series C in Anticipation of VR/AR Growth

There were many enhancements shown off in the demonstration. Using the original Unite demonstration video as a comparison, the team have clearly focused hard on streamlining the user interface, all the while stressing that everything you can do in standard Unity can be achieved in VR too. User interface panels looked much slicker than in EVR’s initial showing, and some impressive actions were demonstrated, such as physically dragging models between different, fully rendered virtual scenes in real time.


Of course, it’s still difficult to gauge whether the benefits and obvious wow factor of working in VR will result in a more effective workflow overall. I suspect long-time Unity veterans will take some persuading to toss aside actions within the traditional UI which will likely be committed to muscle memory at this point. Nevertheless, it remains an important commitment to VR as a platform and a glimpse at ways in which immersive technology may affect the way we work in the future.



  • DiGiCT Ltd

    Much better than before.
    It’s a nice foundation and I’ll follow how it progresses. So far it’s not much use to me for designing, but it’s very useful for doing performance tests with the profiler and log in VR.

    I’m used to designing in Maya and 3ds Max; the navigation is similar, and a keyboard also types faster.
    VR seems easier and faster, but it actually isn’t for a designer.
    Therefore this tool fits level designers more than artists themselves.

    • dogtato

      They said it supports both, so I think you can safely assume rift will work without an explicit demo.

      • DiGiCT Ltd

        Accuracy might be an issue; you might need to type in more position data to correct the offset, as the Lighthouses are millimeter-precise on location.

        • dogtato

          The human body is going to be the limiting factor. Even moving things with a mouse hits an accuracy limit where you need snapping and motion controllers make that worse because you don’t have a desk to brace against.

  • VRgameDevGirl

    I CAN’T WAIT. Hopefully it will be free though… Just seems too good to be true/free.

    • NullReference

      It’s open source, so it must be free :)

  • Nein

    This seems like it will only be useful for very specific tasks. Positioning detailed props on tables seemed like a great example. But what is NOT intuitive, and very slow, is using the Wii-style hover numpad and the Inspector menu filled with lots of tiny bars and buttons. That looked atrocious to work through. Not to mention they are seen through the far-from-perfect HMD resolution. Your arms will get tired much more quickly than with the good old-fashioned keyboard too. It also won’t have the precision and control of mouse and keyboard, let alone a way to map all sorts of Ctrl+key functions to your Touch or Vive wands.

    • Szymon Ezekiel Labunski

      Fair perspective, but the classic method of using Unity can be just as cumbersome to people who haven’t worked with it in the past. As someone with fresh eyes who’s coming to development, VR might seem like a more approachable method of making a world as you would want to see it IRL, rather than the classic input/viewing methods, which can be difficult to wrap your mind around.

      A decent example of this is language learning: it’s easier to learn to speak a language than it is to read and write one. Sure, reading and writing is more precise and you’re able to express yourself in further detail, but it can be scary as an outsider to attempt to read/write a language you’ve never worked with before.

  • Like all VR demos, it is probably far better inside it than watching the 2D screen.

    Layout of complex scenes while you see from the users full 360 VR perspective is really great. You can always jump out to traditional 2D too.

    I love the fact that Unity is really pushing the toolset more so than others, and with the joint partnership with Vuforia it is just getting better.

    Also, I can certainly imagine an OS that is fully headset compatible, and this allows you to build your own RAD GUI concepts. Full praise to Unity for making EditorVR open source.

  • Daniel Gochez

    I think for most purposes it would be better to input numbers by hold-trigger-drag: more interactive and much faster. And maybe add a modifier key (like grip) so that when you drag, the change is tiny.

    • I think the only long-term solution is voice recognition for textual input.

  • beestee

    It is going to take some time for this experience to be refined to the point that it would surpass traditional KB/Mouse/Monitors from a productivity standpoint. From a creative standpoint, though, this already seems to break through some barriers. WYSIWYG game engines are not that old yet, but they have made it significantly easier to jump into basic game development. I might be oversimplifying it some, but this just seems to be the next step towards democratization of game development.

    I do computer renderings of architectural structures for a living. My job exists because it is not easy for everyone to visualize the look of a built structure from black and white lines and notes/cross-references. Even with photographic 3d renderings, it is difficult/impossible to convey a true sense of scale and depth.

    VR has changed architectural rendering. You can now combine photographic quality with true immersion in a space; it breaks down barriers to perception. The dialog created by a VR presentation vs. a traditional rendering changes significantly along with it. Real, meaningful changes can be made to a design from a human standpoint. These types of changes wouldn’t typically occur until after a structure was built and then subsequently remodeled; even then it is a rare occurrence due to costs and would typically never happen if it only impacted the aesthetics of the space.

    It does seem that there needs to be a more efficient means of input for text/numbers/code, perhaps speech recognition could help here?

    • Totally agree. Speech recognition, if you cannot see your hands, is the only way.