Epic Games, the company that makes Unreal Engine, recently released a substantial update to its MetaHuman character creation tool which, for the first time, allows developers to import scans of real people for use in real-time applications. The improvements offer a glimpse of a future where anyone can easily bring a realistic digital version of themselves into VR and the metaverse at large.

Epic’s MetaHuman tool is designed to make it easy for developers to create a wide variety of high quality 3D character models for use in real-time applications. The tool works like an advanced version of a ‘character customizer’ that you’d find in a modern videogame, except with a lot more control and fidelity.

A 3D character made with MetaHuman | Image courtesy Epic Games

On its initial release, developers could only start building their characters from a selection of preset faces, then use the tools from there to modify the character's look to their taste. Naturally, many experimented with trying to recreate their own likeness, or that of recognizable celebrities. Although MetaHuman character creation is lightning fast compared to building a comparable model manually from the ground up, achieving the likeness of a specific person remains challenging.

But now the latest release includes a new ‘Mesh to MetaHuman’ feature which allows developers to import face scans of real people (or 3D sculpts created in other software) and then have the system automatically generate a MetaHuman face based on the scan, including full rigging for animation.

There are still some limitations, however. For one, hair, skin textures, and other details are not automatically generated; at this point the Mesh to MetaHuman feature is primarily focused on matching the overall topology of the head and rigging it for realistic animations. Developers will still need to supply skin textures and do some additional work to match hair, facial hair, and eyes to the person they want to emulate.

The MetaHuman tool is still in early access and intended for developers using Unreal Engine. And while we're not quite at the stage where anyone can simply snap a few photos of their head and generate a realistic digital version of themselves, it's pretty clear that we're heading in that direction.

– – — – –

However, if the goal is to create a completely believable avatar of ourselves for use in VR and the metaverse at large, there are still challenges to be solved.

Simply generating a model that looks like you isn’t quite enough. You also need the model to move like you.

Every person has their own unique facial expressions and mannerisms which are easily identifiable by the people that know them well. Even if a face model is rigged for animation, unless it’s rigged in a way that’s specific to your expressions and able to draw from real examples of your expressions, a realistic avatar will never look quite like you when it’s in motion.

For people who don't know you, that's not too important because they don't have a baseline of your expressions to draw from. But it would matter in your closest relationships, where even slight changes in a person's usual facial expressions and mannerisms can signal a range of conditions, like being distracted, tired, or even drunk.

In an effort to address this specific challenge, Meta (not to be confused with Epic's MetaHuman tool) has been working on its own system called Codec Avatars, which aims to animate a realistic model of your face with completely believable animations that are unique to you, in real-time.

Perhaps in the future we’ll see a fusion of systems like MetaHumans and Codec Avatars; one to allow easy creation of a lifelike digital avatar and another to animate that avatar in a way that’s unique and believably you.

  • Rudl Za Vedno

    Thanks, but no thanks. Let Evil Zuck scan his own freaky Data like body.

    • kontis

      This is Unreal, nothing to do with meta or Zuckerberg.
      Apple is million times more reliant on user data for profits than Epic Games and people call them privacy champions, LOL.

    • Max-Dmg

Facebook most probably used the common term of ‘meta’ partly so they would be associated with terms that contain it and falsely claim credit.

    • ViRGiN

      we all trust in steamvr, with it’s rayman-recroom mix of avatars for their social steamvr homes right?

  • kontis

    For those who only read headlines and comments: it only imports shape, but not textures or hair.

    Also important to note in the context of VR: this is Unreal, so most of the cutting edge good stuff isn’t designed for VR or not even working. All the best materials and effects are only for the heavy deferred renderer, not for forward shading, which is optimal for VR.

    • Jerald Doerr

      Yeah, It’s just a Meta Human (import) plugin.. But I’d be surprised if Meta Human doesn’t work with VR right out of the box. I don’t think there’s much to it as it should just be a mesh with displacement targets. But I don’t use Unreal so I could be wrong.

  • Briann

    Although we were 13 years too early at Evolver, we solved the clone tech from photo as well as the animation rig problem with automatic export for bones or blendshapes for the face. Or I should say, Dr. Michel Fleury did!

  • Max-Dmg

    This would be good in hospitals for doing accurate facial reconstruction.

    • Lucidfeuer

No it wouldn’t at all. It’s good for games but it’s not optimised for them yet.

  • Max-Dmg

    Are any games using this technology yet?

    • dk

      soon vr pron :P

  • Warscent

    The trailer is so woke. Beautiful .

  • Very cool. As you say, it may not be perfect, but it’s a good step forward

  • BananaBreadBoy

    Before the inevitable “WHY WOULD I WANNA BE MYSELF IN VR, I WANNA BE A 100 FOOT TALL NEON DRAGONKIN!!” comments come, a reminder that different apps and social media will necessitate different avatars for different purposes. Being a busty anime catgirl in VRchat with friends is one thing, but not really the face you’d put on with coworkers or while visiting grandma.

    No different than LinkedIn having different culture to Twitter which has a different culture to FurAffinity.

    • dk

      u can also look like any other human

    • Tyme

      eww.. i use replica avatars of myself in most things.. that said.. fuck vrchat.. way to many yanks /vommit

  • David Hothersall

    I was part of a start-up in 2000 called bioVirtual that did exactly this. Humans from 2 or 4 photographs transposed onto an editable mesh with pre-set animations.
We produced a product called 3DMeNow in both pro and consumer editions. We sold a good few copies to big studios; we know it was used in the original Tiger Woods Golf, but most customers wouldn’t reveal to us why they were using it. I sold 5 copies to Konami alone!
    We had an importer to add our own characters to Unreal and it was such a weird thing to shoot a recognisable colleague in our lunchtime fragging sessions!!
    We also did some idents for BBC TV in the UK but most people didn’t seem to ‘get it’. I tried to interest Epic at the time and even spoke to Marc Rein but unfortunately the company (biovirtual) closed for reasons unrelated to the tech.
    Right idea just waaay too early.

  • Pietro Veragouth

    It was news that I had been waiting for for a long time and that in my opinion creates the trigger for a disruption. We will see…

  • Tyme

    either way he is stupid thinking this has anything to do with meta :’)