Soon you will be able to aim your smartphone at a restaurant window, and the available reservation times will instantly pop up on your screen. If you don a pair of 5G-enabled glasses, you will be able to see the weekly promotions at your local supermarket through your lenses before you set foot in the store. This is the world augmented reality (AR) promises to deliver.
But despite the boom in software development kits such as Apple’s ARKit, Google’s ARCore, and Amazon Sumerian, and the continued development of Microsoft’s Mixed Reality ecosystem, consumer uptake of AR has been slow.
One reason for the slow uptake is that the current infrastructure is not robust enough to support the mass adoption of AR. Pokémon Go — the AR game that launched to mass hysteria in summer 2016 — showed the limitations of the current cloud infrastructure for AR. When the search for rare Pokémon drove huge crowds to one location, it strained network bandwidth and caused slowdowns and outages. (The inaugural Pokémon Go Fest, which gathered 20,000 users in Chicago, suffered from exactly these problems.)
That is why John Hanke, CEO of Niantic, the company behind the game, is investing in the AR Cloud: a 3D virtual map overlaid on the real world, where information and experiences are tied to specific physical locations. With it, users can visually search the internet in real time. It’s a bit like running a Google search on the world around you just by looking at it.
However, an AR Cloud needs the right infrastructure to support it — and right now, how that infrastructure will get built is an open question. Some experts predict that the AR Cloud will represent a bigger shift in technology than the introduction of smartphones. One thing is certain: In order to deliver this AR-enabled future through the AR Cloud, we will need to dramatically rethink how the internet is built.
Cloud computing isn’t ready for the AR Cloud
Over the years, the internet has shifted from being mostly text to mostly image to mostly video. All of those pieces of media could be created and shared anywhere on the planet. That’s why the cloud infrastructure was built up through a combination of cloud storage solutions like Amazon’s AWS and content delivery networks (CDNs) — networks of data centers that quickly and reliably distribute content from a central location to endpoints around the globe. But that kind of centralization doesn’t make sense for latency-sensitive, localized AR applications.
AR apps also rely on precise localization, but current GPS-based positioning — typically accurate only to within a few meters — is not precise enough to ensure that a menu appears in the restaurant window rather than 10 feet down the road. The AR Cloud addresses this by building a shared 3D “map of the real world” against which devices can localize themselves far more precisely, so content stays anchored to the right spot.
Because the AR Cloud’s 3D virtual world overlays the real world, with content anchored to physical locations, that content needs to be accessible in real time; even a delay of a few milliseconds can degrade the experience. If the AR menu pops up after you have already walked past the restaurant, what’s the point?
AR content is both created and consumed locally, so it follows that it should be stored close to its physical location to minimize latency, rather than forcing an AR headset to reach a faraway server to retrieve the data it needs to render. 5G, a high-bandwidth, low-latency network standard that pushes compute power to the network edge, will support this approach.
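A back-of-the-envelope calculation shows why proximity matters so much. Signals in optical fiber travel at roughly 200,000 km/s (about two-thirds the speed of light), so distance alone sets a latency floor before any network or processing overhead is added. The sketch below uses illustrative distances — they are assumptions, not measurements of any real deployment:

```python
# Back-of-the-envelope latency floor: propagation delay in optical fiber.
# Signals in fiber travel at roughly 200,000 km/s (~2/3 the speed of light).
FIBER_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip propagation time in milliseconds (ignores all
    queuing, routing, and processing overhead)."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

# Illustrative distances (assumptions for the sake of the example):
edge_site_km = 20       # micro data center at a nearby cell tower
regional_dc_km = 2_000  # distant centralized cloud region

print(f"edge site:   {round_trip_ms(edge_site_km):.2f} ms minimum")
print(f"regional DC: {round_trip_ms(regional_dc_km):.2f} ms minimum")
```

Under these assumptions, a nearby edge site costs a fraction of a millisecond in pure propagation delay, while a distant regional data center costs tens of milliseconds — already past the budget where an AR overlay starts to lag behind the user’s movement, before real-world network overhead is even counted.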
Therefore, if AR is to deliver on its promises, the infrastructure of the internet has to change.
The future technology leaders?
But who will lead in the AR Cloud infrastructure space? Google, Amazon, Facebook, and Microsoft have made big investments in the current, centralized cloud infrastructure but are adapting to this new model. Amazon’s software routes time-sensitive data collected from its smart devices, such as the Echo smart speaker, to local processing to decrease latency, and Microsoft ships similar software in its smart devices. Facebook’s Telecom Infra Project (TIP) has an Edge Computing Project Group that is researching ways to run services and applications at the network edge.
But startups have begun to lead the charge, building small data hubs at the network edge in hopes of bringing content storage closer to end users rather than merely speeding up delivery from distant data centers.
U.S.-based startup EdgeMicro Corp partnered with an unnamed mobile operator to deploy up to 30 micro data centers across the USA. The company claims it can “deliver zero latency at scale” with this delivery method.
Vapor IO Inc. is developing the Kinetic Edge, a network of small data centers placed at the base of cell towers and integrated into the 5G network to create a “single virtual facility” with the lowest possible latency.
This marriage between edge computing and 5G is already happening in the gaming world. AT&T Foundry launched an edge computing test zone in Palo Alto, California, a site several miles wide where developers, startups, and companies can test their AR, VR, and cloud-driven gaming apps under AR Cloud conditions. The first experiment with GridRaster, a mobile AR/VR startup, has already yielded interesting results for optimizing application performance in this new ecosystem, including the need to simplify some capture and rendering functions. The test zone also lets AT&T explore use cases for edge computing, acting as a proof of concept for delivering wider AR applications, telemedicine, and self-driving cars.
Though AT&T’s experiments are ongoing, it is still not clear who will build the physical infrastructure to support the AR Cloud: a startup or a telecom? Only time will tell. But until that infrastructure is built, don’t expect to see your local grocery store’s specials through your glasses’ lenses anytime soon.