Oculus is today launching its ‘Expressive Avatars’ update on Rift and mobile VR; it’s a significant step up in realism thanks to a few additions, including cleverly simulated eye models, lipsync, and even microexpressions. If you’re also hurting for more hairstyles, clothes, and accessories, you might want to pop into your headset once the update goes live today to check it all out.

First unveiled at OC5 last year, Oculus’ avatar overhaul publicly launches today alongside an update to the Avatar Editor on PC and mobile. The new editor includes a range of new customization options, such as lipstick, eye color, brow and lash colors, and new hair, clothing, and eyewear options.

Just as with Oculus’ previous avatar system, third-party apps and games will support the update. Oculus tells us that over the course of a few days, games such as Poker Stars VR, Tribe XR DJ, and Epic Rollercoaster will all add support. While not stated explicitly, it’s clear the company is hoping to win over more third-party developers with ‘Expressive Avatars’, as many games on the platform use their own avatar systems.

Oculus is set to publish its blog post officially announcing Expressive Avatars at some point today.

Express Yourself

Oculus released the first version of Oculus Avatars in 2016, and while the company has since let users customize their persistent digital likenesses with a variety of textures, clothing, and hairstyles, the lack of eye and face tracking meant avatars were essentially inarticulate masks, leaving users to rely on motion controls and body language to convey emotion.

Oculus previously used eyewear to avoid off-putting stares | Image courtesy Oculus

This was due to the fact that no Oculus device actually features face or eye-tracking, which would naturally give avatars a greater avenue for 1:1 user expression. And with the impending release of Oculus Quest and Rift S, that’s still going to be the case, as neither headset offers such sensors. Hardware notwithstanding, Oculus has been hacking away at what it can plausibly infer in order to better simulate realistic-looking eye movement, blinking, facial movement, and lipsync, all in the name of making avatars more human.

SEE ALSO
Oculus Brings More Lifelike Sound Propagation to Audio SDK 1.34

“We’ve made a big step forward with this update,” says Oculus Avatars product manager Mike Howard. “Bringing together expertise in art, machine learning and behavioural modeling, the Oculus Avatar team developed algorithms to accurately simulate how people talk to, and look at, objects and other people—all without any cameras to track eye or face movement. The Oculus Avatar team were able to codify these models of behavior for VR, and then had the fun job of tuning them to make interactions feel more lifelike.”

Keeping It Real

Oculus’ Mike Howard penned a deep-dive article on the past, present, and future of Oculus Avatars, which tells us a little more about the challenges the company faced in creating more realistic avatars on its current hardware, limited by the lack of on-board biometric tracking and by users’ computers and mobile headsets, while staying well clear of the uncanny valley.

That’s something you can’t afford to brush up against if you want users to invest time both in creating their digital likenesses and in interacting with others, Howard maintains.

“In VR, when you see an avatar moving in a realistic and very human way, your mind begins to analyze it, and you see what’s wrong. You could almost think of this as an evolutionary defense mechanism. We should be wary of things that move like humans but don’t behave like us,” he says.

Making an avatar that simply moves its mouth when you talk and blinks at regular intervals wasn’t enough for Oculus. The system needed to infer when a user might plausibly blink, and make the best possible guess at how a user’s mouth should move when forming words. That last part is particularly tough, as humans move their mouths before, during, and after producing a sound, leaving any predictive system with a hard ceiling on accuracy. More on that in a bit.

As for eyeballs, the realization that VR headset users typically only move their eyes about 10 degrees off-center, and use their head to accommodate the rest of the way to look at any given object, made it “easier to predict where someone was looking based on head direction and the objects or people in front of them in a virtual environment, giving us more confidence in being able to simulate compelling eye behaviors,” Howard maintains.
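
The heuristic Howard describes can be sketched in a few lines. The following is a minimal illustration (not Oculus’ actual code; the function name, target list format, and 10-degree threshold are assumptions drawn from the description above): pick a gaze target only if fixating it would keep the eyes within roughly 10 degrees of the head’s forward direction, otherwise keep looking straight ahead.

```python
import math

def pick_gaze_target(head_dir, targets, max_eye_offset_deg=10.0):
    """Hypothetical gaze-inference sketch: choose the salient object
    closest to the head's forward direction, provided the eyes would
    not need to rotate more than ~10 degrees off-center to fixate it.
    Directions are unit-length (x, y, z) tuples."""
    def angle_deg(a, b):
        dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
        return math.degrees(math.acos(dot))

    best, best_angle = None, max_eye_offset_deg
    for name, direction in targets:
        a = angle_deg(head_dir, direction)
        if a <= best_angle:
            best, best_angle = name, a
    # None means no target is comfortably in range: eyes follow the head
    return best
```

An object 6 degrees off the head axis would be selected as a fixation target, while one 30 degrees off would be ignored until the user turned their head toward it.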

The system is said to simulate blinking and a host of eye kinematics, such as gaze shifting, saccades (rapid eye rotations, usually during a change of focus), micro-saccades, and smooth tracking of moving objects.
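
To make the blinking side of this concrete, here is a minimal sketch of one plausible approach (an assumption on our part, not Oculus’ implementation; the class name, intervals, and saccade-blink probability are all illustrative): schedule blinks at randomized human-like intervals, and let a gaze shift occasionally trigger an early blink, since people often blink when rapidly changing focus.

```python
import random

class BlinkSimulator:
    """Illustrative blink scheduler: fires at randomized intervals,
    with gaze shifts (saccades) sometimes forcing an early blink."""

    def __init__(self, mean_interval=4.0, jitter=2.0, seed=None):
        self.rng = random.Random(seed)
        self.mean_interval = mean_interval
        self.jitter = jitter
        self.next_blink = self._schedule()

    def _schedule(self):
        # Next blink lands mean_interval +/- jitter seconds from now
        return self.mean_interval + self.rng.uniform(-self.jitter, self.jitter)

    def update(self, dt, gaze_shifted=False):
        """Advance the clock by dt seconds; return True if a blink fires."""
        self.next_blink -= dt
        if gaze_shifted and self.rng.random() < 0.3:  # saccade-linked blink
            self.next_blink = 0.0
        if self.next_blink <= 0.0:
            self.next_blink = self._schedule()
            return True
        return False
```

Driven at a typical frame rate, this produces roughly 10–30 blinks per minute, in the range of natural human blinking.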

And for simulated lip movement, the team discovered it could model the intermediate mouth shapes between each sound and the next by controlling how quickly individual (virtual) mouth muscles can move, a technique the team dubs ‘differential interpolation’.
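
Based on that description, the mechanic can be sketched as a per-muscle rate limit (again, an illustration under our own assumptions, not the actual Oculus implementation; muscle names and rates are hypothetical): each muscle moves toward its target viseme weight, but no faster than its own maximum speed, so fast muscles snap to new shapes while slow ones lag behind, producing the in-between shapes.

```python
def step_mouth(current, target, max_rates, dt):
    """Rate-limited interpolation toward a target mouth shape.
    current/target: {muscle: weight in [0, 1]}
    max_rates: {muscle: max weight change per second}
    dt: frame time in seconds."""
    next_pose = {}
    for muscle, value in current.items():
        delta = target[muscle] - value
        max_step = max_rates[muscle] * dt            # per-muscle speed limit
        step = max(-max_step, min(max_step, delta))  # clamp toward target
        next_pose[muscle] = value + step
    return next_pose
```

With a slow jaw and fast lips, one frame toward a new viseme moves the lips most of the way while the jaw trails, which is exactly the intermediate-shape effect described above.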

The team has also included micro-expressions to keep faces looking natural during speech and at rest, although it is clearly steering clear of strongly implied expressions such as extreme happiness, sadness, or anger. An avatar looking bored or disgusted during a lively chat could cross wires socially.

What Oculus Avatars *won’t* do | Image courtesy Oculus

In the end, Howard makes it clear that more realistic-looking avatars are technically within the purview of current head and hand tracking hardware, although compute budgets across all platforms put a hard barrier on the sort of skin and hair that can be simulated. Frankly put: a more detailed skin texture means you have to model how that skin naturally stretches over a face, and more detailed skin also demands equally detailed hair to match.

SEE ALSO
Oculus Debuts 5K × 5K Mobile VR Playback in 'Henry', Now Available on Go & Gear VR

“Given our learning to date, we determined that we would use a more sculpturally accurate form, but we’d also use texture and shading to pull it back from being too realistic, in order to match the behavioral fidelity that we were increasingly confident we could simulate,” Howard explains. “Our goal was to create something that was human enough that you’d read into the physiological traits and face behaviors we wanted to exemplify, but not so much that you’d fixate on the way that the skin should wrinkle and stretch, or the behavior of hair (which is incredibly difficult to simulate).”

There’s still plenty left to do. Oculus Avatars aren’t seamlessly available in all games on either the Oculus platform or Steam, requiring developers to integrate them on a case-by-case basis. And not to be missed: avatars are still basically floating torsos and hands at the moment. To that end, the company is working on inverse kinematics models to make full-body avatars a possibility.

If you want to read more about the history and possible future of Oculus Avatars, check out Howard’s deep-dive when it goes live later today.

Update (12:15 ET): A previous version of this article stated that Oculus Avatars aren’t cross-platform; this isn’t accurate. Oculus made a cross-platform option available to developers last year, although this integration must be done on a game-by-game basis. Developers can choose to use default Oculus avatars or allow unique Oculus platform user avatars in their game, though it’s far from the seamless integration that the word ‘cross-platform’ might imply. We’ve updated the offending bit to better reflect this distinction.

This article may contain affiliate links. If you click an affiliate link and buy a product we may receive a small commission which helps support the publication. See here for more information.


  • Xron

    Hope that gen 2 hardware will let us use even more lifelike avatars, but progress is always welcome!

  • Radiation

“Oculus Avatars aren’t cross-platform”.. but they have been for over half a year at least, from a developer perspective.

    • Hi Radiation. Thanks for pointing that out. I’ve updated the piece with more accurate info regarding cross-platform integration.

  • DoctorMemory

    I am still in the camp of, “If I wanted lumpy guy in the mirror I would stay in actual reality”. VR for me is about getting out of reality and having options I could not otherwise get. I still see nothing wrong with the original avatars. If I want to be made of stars or code from the Matrix let me have that option. Then provide an easy way in the SDK for developers to present me with a choice of re-skinning or choosing a more experience appropriate avatar.

    • care package

More lifelike avatars are a natural progression of VR

  • Rogue Transfer

A little bit too wax-like – esp. the lack of knuckles (pardon the pun) and no texture on the hands. With the hands being the most important aspect for the user (you rarely see your own avatar face in VR), they need to be good.

    The expressions aren’t too bad, though I really think Rec Room has a great system for detecting & showing extremes of annoyance/happiness at appropriate moments. Even with their much more cartoony-style, their clever emotion code is superb.

    • DoctorMemory

      The wax-like part really creeps me out. I think one of the reasons I dislike the “realistic” avatars is that when I choose the ones that matches my race/gender I get a big uncanny valley feeling off of them when I look in the mirror. I would much rather have definitely not real over really close but not quite real.

  • Matilde Constance

    Ridiculous!

  • Amazing, can’t wait to try it!

  • Les Vega

    this is absolutely horrifying, seriously has nobody had the Uncanny Valley talk or were these all made to look like Mark Zuckerberg?

    • Tags I812

      lol

  • iThinkMyCatIsAFlea

    Look at how inclusive they’re trying to be.

    Remember that time when Oculus’ co-founder, Palmer Luckey, financed that alt-right group, Nimble America, that Milo Yiannopoulos was involved with? And Oculus and Facebook stood by Palmer.

    And Palmer supported Trump. He even donated $100,000 to Trump’s inauguration.

    Google Oculus Trump.

    Don’t support Oculus.

    • CarlosTSG

      Seriously go and read “The history of the Future” by Blake Harris then you’ll understand what Zuckerberg and Facebook is really like.

      • iThinkMyCatIsAFlea

        I know what Facebook/Oculus are really like.

    • plrr

Don’t decide for people which politicians are okay to support, or which ethnicities deserve respect. It is bad behavior.

    • Tags I812

man you been drinking too much of that Kool-Aid.

  • Tags I812

This is great. I’ve been waiting for this from the beginning.

  • gigus

I am not sure that I want them to infer my expressions, eye blinks seem OK, but I am a bit concerned about having my responses misinterpreted. For instance a buddy of mine irl was in a bar and it was so loud he couldn’t hear what the guy next to him was saying so he just continued to smile, laugh and nod his head, he almost got his ass kicked, the guy next to him was telling him about his wife’s infidelity. Sometimes the appropriate response is sadness, sympathy, boredom or anger. If I had confidence they were detecting the correct emotion or accurately sensing your real expressions I would feel better about it. I am also not a big fan of the new avatars. I preferred the older more simple abstract avatars, I had one I liked but since they updated haven’t been able to get it back. I do think the “busts w/ hands” approach is really good though.

  • sfmike

    Strange generic head shapes are a real turn off. Male characters end up all looking like trans-sexuals.