Last month Amazon quietly announced the ‘Wavelength’ platform as part of its Amazon Web Services (AWS) offering. The new ‘edge computing’ service promises “single-digit millisecond latencies” over 5G networks. Amazon says the platform is made for “latency-sensitive workloads” including AR/VR streaming, game streaming, IoT and more.

AWS is one of the most prevalent cloud computing platforms in the world, acting as the back-end web infrastructure for millions of customers.

In an announcement last month (which seems to have slipped under the radar of many of us in the XR space), Amazon revealed a new AWS service called Wavelength, which is designed specifically for latency-sensitive applications served over 5G networks. Promising “single-digit millisecond latencies” to end-users, Amazon is working with major mobile carriers deploying 5G networks to locate AWS resources at the ‘edge’ of those networks, enabling low latency for applications like cloud-rendered AR and VR content.

As we noted in our feature—Sifting Reality From Hype: What 5G Does (and Doesn’t) Mean for VR & AR—it is edge computing, not merely 5G, which is the key enabler for streaming real-time AR and VR applications from the cloud.

“Today, application traffic has to travel from a device to a cell tower to metro aggregation sites to regional aggregation sites and to the Internet before it can access resources running in AWS. These network hops can result in latencies of more than 100 milliseconds. This prevents developers from realizing the full potential of 5G to address low-latency use-cases”, Amazon wrote. “Wavelength addresses these problems by bringing AWS services to the edge of the 5G network, minimizing the latency to connect to an application from a mobile device. […] [Wavelength] allows developers to build the next generation of ultra-low latency applications using the familiar AWS services, APIs, and tools they already use today—eliminating the need for developers to negotiate for space and equipment with multiple telecommunications providers, and stitch together application deployment and operations through different management interfaces, before they can begin to deploy their applications.”
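
The pitch in that quote is that deployment works through the same APIs as any other AWS infrastructure. As a rough illustration only, here is how launching a rendering instance into a carrier-edge ‘Wavelength Zone’ might look with the AWS SDK for Python (boto3); the zone group name, zone name, VPC, subnet, and AMI identifiers below are placeholders for illustration, not values confirmed in Amazon’s announcement.

```python
# Rough sketch: deploying a latency-sensitive workload into a Wavelength Zone
# with boto3. Zone names, VPC/subnet/AMI IDs below are illustrative placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Wavelength Zones are opt-in extensions of a parent AWS region.
ec2.modify_availability_zone_group(
    GroupName="us-east-1-wl1",                   # assumed zone group name
    OptInStatus="opted-in",
)

# Carve out a subnet for the carrier-edge zone inside an existing VPC.
subnet = ec2.create_subnet(
    VpcId="vpc-0123456789abcdef0",               # placeholder VPC
    CidrBlock="10.0.8.0/24",
    AvailabilityZone="us-east-1-wl1-bos-wlz-1",  # placeholder Wavelength Zone
)

# Launch a GPU rendering/streaming instance exactly as in any other AZ.
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",             # placeholder AMI
    InstanceType="g4dn.2xlarge",                 # assumed GPU instance type
    MinCount=1,
    MaxCount=1,
    SubnetId=subnet["Subnet"]["SubnetId"],
)
```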

Amazon points to “emerging interactive applications like game streaming, virtual [and augmented] reality, and real-time rendering that require latencies of single-digit milliseconds to end-users,” as potential use-cases for AWS Wavelength.

At the outset, the company is engaging with carriers Verizon, Vodafone, SK Telecom, and KDDI to launch the service in the US, Europe, South Korea, and Japan in 2020. Amazon says the service is presently undergoing pilot testing by customers using Verizon’s mobile edge compute (MEC) system.

As far as applications using the AWS Wavelength service go, Varjo, maker of high-end VR headsets for enterprise, is among the first to be announced. The company believes cloud rendering of AR and VR content is crucial to scaling the technology.

SEE ALSO
Now 150 People Strong, Varjo Talks Future Focus, More Affordable Headsets

“Now, instead of having to develop expensive local computing services that would be impossible to run on a battery-operated device, we can use edge computing to scale the rendering power and the business of our industrial-grade [headset] from thousands to hundreds of thousands of units,” said Varjo CEO Niko Eiden.




  • Rupert Jung

    So they basically place a computer directly next to a 5G antenna…?

    • Jistuce

      Presumably they also mount these antennas on every one of their warehouses.

      I still say it is ridiculous.

  • Jim P

    Full 5G does not go through walls

    • Lark R

      Outdoor VR! And make sure you are within half a block of a node or you’ll probably get compression artifacts.

    • mirak

      it doesn’t go through rain

  • ale bro

    It can’t be as bad as Google Stadia!

    But seriously, nobody believes single-digit ms lag; that’s less than a wired DualShock.

    • ivan

      Stadia actually is the best streaming service in terms of latency.

      • Lark R

        Maybe, but even on Google Fiber it’s still not good enough for zoomed-in headshots to feel good. I can only imagine VR would be orders of magnitude worse.

        • aasdfa

          You understand that fast-paced first-person shooters aren’t the only games that exist, right?

          • Zantetsu

            That may be true, but all VR games need the best latency possible, and fast paced first person shooters are pretty good for exposing latency problems. So I think Lark R’s comment is very apropos here.

          • D-_-RAiL

            I have not had any noticeable latency issues using Stadia Pro yet. The graphics will buffer and artifact before any input latency slowdown. I feel like people are just reading comments and articles from freelance writers that have a bias against streaming services.

          • Zantetsu

            I am not sure you know what the term “buffer” means in this context, because buffering implies latency, which you are saying you do not experience with Stadia Pro.

            I also don’t know how you can tell the difference between input latency and output (graphics) latency, since the only observable effect of your input (a change in output) includes both latencies.

            Can you explain in more detail what you mean?

            Also … a Google search for “stadia latency” shows that lots of people think that latency problems there are real. That makes sense given what we know about how computer networks work. I always take any claims of low latency with any network that passes through a router with a very large grain of salt.

          • aasdfa

            You understand tech doesn’t evolve overnight, right? And even though these things are not perfect just yet, the benefits they’ll provide are big enough to start going in that direction. It’s like ray tracing: most don’t see the true benefits of what ray tracing provides for the developer, they just see a more expensive graphics card.

          • Zantetsu

            Yes, I understand that, but I do not understand how your statement is in any way relevant to what I said.

          • Jistuce

            Latency and artifacting are caused by different issues with network streaming. Latency is caused by data transit time, while artifacting is caused by the available bandwidth (or more accurately, by the video quality being reduced to keep frame rates up within the available bandwidth).

            So image quality reduction has nothing to do with latency, except that both are caused by networks being imperfect.

          • JakeDunnegan

            Perhaps not, but they are the most popular.

          • Lark R

            Beat Saber is the most popular VR-only game, and Valve had to rework SteamVR to allow faster controller rotation than they thought humanly possible.

        • D-_-RAiL

          I feel like you are just referencing what someone else said. I have Stadia Pro and it’s actually pretty legit, with no noticeable input latency, and the clarity is actually really good, but it varies from game to game.

          • Lark R

            That’s probably because other people had similar experiences. I got a buddy code or whatever they call it, so I’ve used Stadia on my gaming PC with a 144Hz monitor, wired Cat5e to the Google Fiber box, and the latency is not good enough in the most extreme situations. Also, most of the times that I tried it were very early in the morning on the weekend, like 6-7am CST, so well away from any possible peak time.

            Granted, when just running around and platforming, not zoomed in and not trying to snap to targets, it felt great.

        • mirak

          It can be fixed partly with prediction.

      • NooYawker

        It’s the best because it’s the only one.

    • Ted Joseph

      I have been a gamer for decades and have purchased almost every gen of Xbox and PlayStation since they came out. I have the Xbox One X and PS4 Pro. I couldn’t wait until next Xmas to purchase the Xbox Series X and PS5… that was until Stadia came out. I am sold on cloud gaming. So far Google has the best on the market. If another company beats them overall, I will change. Regardless, the Xbox One X and PS4 Pro will be my last consoles…

  • Miqa

    Is that really good enough, though? I think 20 ms total time per frame is what is often quoted as the requirement for VR, no? Single digit could be up to 9 ms, which isn’t that impressive. For remote rendering, if they can’t promise 2 ms, is it really useful for VR?

    • asshat

      Yes, and that’s alllll the lag, because you’re not doing any more computing on your end; it just needs to go there and back and it’s done. So it’s pretty fast, and cloud computing, so that you don’t have to wear a huge PC on your face or be plugged into one, is the future.

      • Miqa

        It still needs input, then remote rendering, and then sending the frame back. So I’m not so sure.

      • Jistuce

        That’s not “alllll the lag”. It is the time it takes a compressed video frame to get from the rendering computer to your display unit. One slice of the lag.

        There’s also latency involved in:
        collecting input on the user’s side,

        transmitting input back to the rendering computer (this should be symmetrical to the latency of downloading a compressed frame, so we’ll say 9 ms),

        decoding the compressed frame on the user’s side,

        sending it from the decoder to the display panel,

        display panel response time, and

        latency inside the rendering computer (which is hopefully similar to local compute plus MPEG encoding, but stands a chance of being appreciably worse).

        Basically, we’ve found a way to add “up to 9 ms” latency to input, and again to output. And add extra steps to the pipeline, each step incurring additional latency.
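
        To put rough numbers on those slices, here’s a back-of-the-envelope tally in Python; every figure below is an assumed placeholder for illustration, not a measured value.

        ```python
        # Back-of-the-envelope motion-to-photon budget for cloud-rendered VR over 5G edge.
        # Every number below is an illustrative assumption, not a measurement.
        latency_ms = {
            "collect input on headset":          1.0,
            "uplink input to edge server":       9.0,  # Amazon's "single-digit" upper bound
            "render + encode frame at the edge": 7.0,
            "downlink compressed frame":         9.0,
            "decode frame on headset":           3.0,
            "scan-out / panel response":         5.0,
        }

        total = sum(latency_ms.values())
        print(f"Estimated motion-to-photon latency: {total:.0f} ms")  # ~34 ms with these guesses

        # A commonly cited comfort target for VR is roughly 20 ms motion-to-photon, so under
        # these assumptions the streamed path overshoots the budget unless techniques like
        # asynchronous reprojection on the headset hide the difference.
        ```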

  • D-_-RAiL

    Now it’s looking like what I have been preaching for several years is coming true. This is when VR will really take off: when you can stream games to VR standalones near flawlessly. It’s still several years off, but it’s a huge driving factor that will push VR forward in a major way.

    • adf

      100% agree, and I can’t believe all the negative views toward cloud computing.

      • JakeDunnegan

        Well, because most people have invested thousands into their gaming rigs, and cloud computing adds nothing to that. Most of the games offered have been purchased already, and they’d have to buy them again.

        You saw how much gamer angst there’s been about the Epic Games Store? And I can only imagine you can multiply that by whatever when you add yet another game store you have to buy from, and you won’t even have the game on your machine but in the cloud, which is a paradigm shift in how people think of their games.

        Also keep in mind how crazy people went when Xbox said you’d have to be connected to the network/cloud with the Xbox One – they basically LOST the generational war with PS4 b/c of that one effup. And somehow cloud computing is going to be this panacea? I don’t see it.

        • StarLightPL

          As long as companies forcing a paradigm shift on us don’t shift THEIR thinking that I need to “buy” games on their service, they can kiss me and my gaming rig goodbye. Lag aside, the only acceptable arrangement for me in this unfavourable relationship is buying a streaming subscription akin to Netflix and having UNLIMITED access to games, like in Origin Access or Xbox Game Pass. Otherwise, eff them and their greedy service. My rig can handle games just fine, and with new Nvidia 7nm cards coming out this year it will be even more future-proof.

        • mirak

          What gamer angst?

          • JakeDunnegan

            Well, I suppose you could google “gamer angst about the Epic Games Store” but, sure, why not, I’ll do it for you.

            “IS EPIC GAMES STORE EXCLUSIVITY WORTH THE RISK? DEVELOPERS SHARE THEIR STORIES” – Newsweek.

            “If you listened to the war cries of YouTube reactionaries or members of the mobilized Twitter mob, you’d think the Epic Games Store was bad for the industry and gamers alike.” Yada yada.

            The creator of ‘Fortnite’ is trying to shake up the PC gaming industry — here’s why a lot of folks are furious about it – Business Insider.

            “When the small indie team behind an upcoming game named “Ooblets” announced plans to exclusively launch on the Epic Games Store, people were furious.

            But to Rebecca Cordingley and Ben Wasser, the married couple behind “Ooblets,” the arrangement was hugely important. By signing with Epic, the “Fortnite” publisher guaranteed up front to make the game a financial success — a hugely important goal to reach for a small indie team.

            But instead of support from excited fans of the upcoming game, the couple received death threats and horrific insults. “When this is all said and done, and your game and career are in shambles, I hope your wife leaves you. Based on her posts though, you guys are a perfect pair of ****heads,” one Reddit user wrote.”

            So. Yeah, it goes on and on; I’m sure a few seconds of your time may yield even better results. And, I might add, it’s pretty stunning how whiny gamers are.

            Just saying.

      • kontis

        And I, on the other hand, am very disappointed that people seem to not care much about OWNERSHIP of computation.

        Once we only have terminals, it’s an Orwellian dystopia. It’s game over for freedom. Buy virtual clothes from Zuckerberg or get a $99 subscription to a digital surgery to be a bit prettier in AR for a month.

        Forget about modifying anything on your own. Unless you are a dev with a workstation that costs more than a car.

  • uKER

    They’ll just start specifying it in seconds.

  • Jovah

    Why does no one care or talk about the rapid brain cell degradation that occurs when humans are around 5G tech? You seriously want to wear a headset that streams 5G electromagnetic radiation through your brain?
    Just another new wave of intelligence culling without safety features implemented.

    • kontis

      There is a 100000000000x more powerful source of the same electromagnetic radiation that has been destroying your body since you were born. Search for “sun” on Wikipedia. It’s terrifying.

  • If the claims are true, this could be very interesting for anyone who wants to experiment with AR and VR via 5G.
