Last week Microsoft became the first major tech company to host a keynote presentation fully in virtual reality. Despite many moving parts, the company managed to deliver a seamless, immersive presentation that could mark the future of this sort of marketing communication.

In the last 20 years, the ‘tech keynote’ has gone from hasty PowerPoint presentations in front of small groups to major multimedia productions that form the cornerstone of many marketing playbooks, often concluding with major announcements and product reveals.

Last week, Microsoft took the next step toward one possible future of the tech keynote when it delivered the opening presentation to its Ignite 2021 developer conference fully in virtual reality.

While far from the first company to deliver a virtual reality presentation, Microsoft—with its $1.73 trillion market cap—is surely the largest company to do so. And its presentation was perhaps the most ambitious and most polished we’ve yet seen, featuring custom-built scenes that stitched together immersive and traditional media alike. Adding to the complexity of execution is the fact that the keynote was designed for viewing both in VR and via a non-immersive livestream for a wider audience.

If you want to see the entire thing you can watch a recording here. Below we'll give an overview of how it all went down from the in-VR perspective.

A Stage Set in Altspace

Microsoft hosted the entire Ignite keynote inside of Altspace, the social VR platform the company acquired back in 2017. Because it's Microsoft's own product, the company had the leeway for a far more customized production than would have been possible with an off-the-shelf solution.

Anyone, anywhere in the world, with access to Altspace was able to join the presentation and get a front-row seat to the keynote. The setup was as you might expect: a seating section for the audience and a stage to frame the action. And while hundreds of users attended the event in virtual reality, they were divided up into many instances of roughly 25 audience members. So while the audience was split between many rooms, they were all watching the exact same presentation unfold at the same time.

Microsoft CEO Satya Nadella Opens with a Video Address

To be clear, the Microsoft Ignite 2021 virtual reality keynote was not just a small experiment for the company. The event was prominent enough, and expected to reach a wide enough audience, that Microsoft CEO Satya Nadella headlined it, outlining the company's vision for the future of enterprise computing.

Image courtesy Microsoft

And while the company didn't go so far as to bring Nadella himself into virtual reality (his video segment was played on a large, movie-theater-like screen), it was Nadella who first announced Microsoft Mesh, which the company hopes will drive immersive computing and collaboration, bringing about a future where virtual reality events like the Ignite 2021 keynote are commonplace.

Alex Kipman On Stage ‘In Person’

While the virtual reality audience of the keynote was all represented by cartoon-ish Altspace avatars, HoloLens visionary Alex Kipman was shown on stage in a life-like representation wearing a HoloLens 2 headset. Around him was a stage designed to look like a coral reef, with the surrounding scenery transformed to make it look like the entire stage was underwater.

From a technical standpoint, it appears that Kipman’s visual representation was achieved with a green-screen video capture which was shown to the audience as a ‘billboard’ texture that rotated to face each viewer no matter their position.

Up close, this technique would have looked very fake in VR, but they smartly kept Kipman just far enough away from the audience that the flatness of his representation wasn't really noticeable. Doing it this way also meant the capture could be done with commodity hardware and software, while retaining a high level of visual fidelity and capturing all of Kipman's real-life mannerisms.
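For readers curious how the billboarding described above works under the hood, here is a minimal sketch in Python. It computes the yaw-only rotation that keeps a flat, upright quad (like Kipman's video capture) turned toward a viewer; the function name and the (x, y, z) convention are illustrative assumptions, not details from Microsoft's actual implementation.

```python
import math

def billboard_yaw(quad_pos, viewer_pos):
    """Return the yaw angle (radians) that rotates a flat quad about the
    vertical (Y) axis so its front face points at the viewer.

    Positions are (x, y, z) tuples with Y pointing up. Only yaw is
    computed, so the quad stays upright rather than tilting toward a
    viewer above or below it -- the usual choice for person-shaped
    billboards.
    """
    dx = viewer_pos[0] - quad_pos[0]
    dz = viewer_pos[2] - quad_pos[2]
    # atan2 gives the heading from the quad toward the viewer in the XZ plane.
    return math.atan2(dx, dz)
```

In an engine, this angle (or an equivalent look-at rotation) would be recomputed per frame for each viewer's camera position, which is why every audience member sees the capture face-on regardless of where they sit.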

While Kipman was speaking, the area above and around him slowly filled with schools of fish and even a huge whale shark which glided smoothly above him as he made his presentation. At a certain point the audience saw a pop-up allowing them to select a fish which would swim toward the stage and join the other schools of fish in their rounds. This was the first interactive element of the VR keynote, but more interesting interaction was yet to come (more on that further below).

As Kipman spoke, a large screen behind him underscored his points with both pictures and video imagery, and occasionally he would fade out to give center stage to a video portion of the presentation.




  • Anonmon

    Hate to fixate on a semi-unrelated point in the context of the article being about Microsoft’s keynote, but frankly, if they’re trying to tout their whole “Interconnected VR and AR devices all seamlessly working together through all kinds of independent software” thing, they might want to work on making Altspace more palatable for anyone who isn’t into that whole thing already. As that’s obviously where they’re going to have their jumping off point if they ever get their grand plan off the ground.

    No arms, no legs and feet, no service from me as I do not care.

    Having extensively used every level of tracking in the likes of VRChat, from nothing but head and hand controllers up to full body (short of adding elbow and knee trackers into the mix), the most frustrating thing in the beginning about trying to be social in VR was not being able to have body language follow what I'm trying to convey. The head and controllers by themselves literally can't; it's straight up not enough to convey the full range of body motions and emotions. Even just adding a hip tracker made a MASSIVE difference to how effectively I could communicate. Bringing feet into the mix, even with only just serviceable IK, makes the experience less like awkward telepresence robots with Rayman hands (or loosely puppeted avatars in the likes of the aforementioned VRChat), and more like actually seeing other people in virtual reality.
    Whoever it was in the early days of VR who had a problem with IK arms and drove home in the minds of some that "no arms is better than non-tracked arms" must have been a strange person, as having no arms in the likes of Alyx is FAR worse personally than the alternative.

    I have things to say about the limitations of how you can make yourself look in Altspace and all the other VR social software of its ilk, but I get that something like Altspace isn't trying to be ChilloutVR.
    Though while I'm at it, Microsoft had better get their heads out of la-la land if they think they're going to be able to tie their vision of the future to Microsoft accounts, as that's DOA if they try to pull anything remotely close to the Facebook thing.

    • Tarzan André

      I couldn't agree more. As a former dancer, now tech consultant, I have been looking for a VR business platform that allows for more full-bodied representation. The closest thing so far has been SpatialVR, which at least has a full torso with face and arms with hands. The IK looks quite awkward at times, but it still gives a lot more to work with in terms of communicating mood and gesture for attention.

      Might it also be a problem of moderation or even censorship? The perceived difficulty of achieving believable IK animation seems too weak an argument. It makes more sense that there is a wish to limit body expression to prevent intentional unwanted gestures and controversial actions. I get that this is a challenging issue in a large live event, but I think it is critical for the growth and maturity of social VR to figure it out.

    • benz145

      The easy answer is to only give avatars arms when seen from a third-person perspective. That way anyone looking at you sees your arms, so you look more like a real person, but you don’t see them in first person, avoiding the proprioceptive disconnect.

      • Anonmon

        That’s the thing though, I for one get MORE disconnect with no arms there at all than I do with arms that don’t track properly. Being able to see the backside of my wrists squicks me out in ways having the elbow be in the wrong place never has or did.
        I need arms, even if they don’t move properly without elbow trackers. And there were too many times before getting foot trackers where I would try and do something that either required doing something specific with the feet, or more often shifting weight in a specific way that required specific foot placement relative to the hips, that just did not work.
        How people settled on "Head and Rayman hands is perfectly fine" I'll never know. Unless we're talking about Facebook, who is vehemently against anything that goes against their 'ease of use' design goal, which makes full body tracking without external sensors impossible.


  • Great post, thanks for summarizing this keynote!

  • Bleargh

    Qualcomm who held a press presentation in Spatial VR last Feb: “Am I a joke to you?”