Is The Metaverse The Future Of VR?

By Mike Reiss – May 01, 2023

Ever since Facebook changed its name to Meta in 2021, there has been much buzz about the Metaverse, along with renewed interest in virtual worlds. While the concept of virtual reality (VR) has been around for a long time, the technology is only now really starting to take off.

 

Defining Metaverse and Virtual Reality

First, the metaverse: the term was coined by Neal Stephenson in his 1992 novel Snow Crash, though the underlying idea of a shared digital world has been a regular feature of science fiction since at least 1984, when William Gibson wrote his novel Neuromancer.

 

VR has been a part of popular culture for decades, appearing in movies like The Matrix, Tron, and Ready Player One. Following the (sometimes unsuccessful) launches of consumer devices like the Oculus Rift, PlayStation VR, Valve Index, HoloLens, and even Google Glass, the technology has steadily moved out of science fiction and into commercial reality. But there is still a long way to go.

 

This article discusses virtual and augmented reality as a spectrum with three main categories: fully immersive virtual reality (VR) at one end, augmented reality (AR) at the other, and so-called mixed reality (MR) in between.

 

This spectrum can be thought of as one broad category called extended reality (XR). It includes the three categories defined above as well as enabling technologies such as haptics and spatial audio.

 

In the future, XR may include brain-computer interfaces, smell and temperature response, and possibly even taste. These futuristic ideas have yet to materialize for a variety of reasons, but mostly because the devices still need a great deal of R&D work, and it is not yet clear what data such sensory interfaces would require. For AR/VR, haptics, and spatial audio, however, we already have devices and data, so those areas are moving forward.

 

We're often asked, "Why hasn't extended reality taken off? Why isn't XR everywhere?" The answer comes down to the technology's current limitations.

 

The Limits of Extended Reality

For AR, today's glasses are heavy and unwieldy, and they come in basically one style. Remember Google Glass or Snapchat's Spectacles? Great if you like that style. Otherwise, you probably won't wear them, no matter how cool the technology is. People want a variety of stylistic options, so to be truly mainstream, the technology must be compatible with many of them.

 

As for VR headsets, the simple truth is that most people don't want to wear them for long periods of time. They are heavy, and they trap heat, so you get hot and sweaty and uncomfortable.

 

They're fine for short experiences, like virtually jumping out of a plane or free diving with great white sharks, but they aren't devices most people will use to watch a feature film or play a three-hour video game. AR and mixed reality devices can be just as bulky: you'll rarely see people wearing a HoloLens in public. That may change as devices get smaller and more comfortable.

 

Future mixed reality devices will need a wider field of view, more features, and more advanced displays for AR applications. Achieving this will require better cameras, infrared (IR) cameras, or other sensors that can accurately map the space around the user, improving the overall quality of the experience. Device manufacturers are aware of these challenges and are already working on solutions.

 

Creating virtual worlds requires offloading data processing from hardware devices

No matter what device the user is wearing, what does the virtual/augmented/mixed reality world actually look like? Is it AR that overlays different skins on real-world environments, making a modern city look medieval or changing people's clothing? Or are we talking about a virtual representation of the actual world, like a digital twin of your city?

 

And even more fantastic: fully immersive virtual environments that don't exist in the real world at all. Rendering these experiences takes an enormous amount of computing, and the devices themselves are too small to hold all the processing power needed.

 

To handle the processing required while making glasses and headsets smaller, lighter, and more portable, mobile networks must improve. To shrink devices, extend battery life, and generate less heat, we need to offload processing to the edge of the network. This has to happen at or below a roughly 20 ms latency threshold, because beyond 20 ms of latency, people in VR start to feel sick. Some advanced AR applications, where the device tracks and recognizes fast-moving objects, will require even lower latency, down to the 5 ms range.
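To make that constraint concrete, here is a minimal sketch, in Python, of how an edge-rendering latency budget might be tallied against the 20 ms threshold mentioned above. Every stage and number here is an illustrative assumption, not a measurement of any real network or device:

```python
# Illustrative motion-to-photon latency budget for edge-rendered VR.
# All stage values below are assumptions for demonstration only.

VR_COMFORT_THRESHOLD_MS = 20.0  # the comfort limit cited in the text

# Hypothetical stages of one edge-rendering round trip (milliseconds)
budget = {
    "sensor sampling + pose prediction": 2.0,
    "uplink to edge node": 4.0,
    "render frame at edge": 7.0,
    "encode + downlink to headset": 4.0,
    "decode + display scan-out": 2.5,
}

total_ms = sum(budget.values())
for stage, ms in budget.items():
    print(f"{stage:<35s} {ms:5.1f} ms")
print(f"{'total motion-to-photon':<35s} {total_ms:5.1f} ms")
print("within comfort threshold:", total_ms <= VR_COMFORT_THRESHOLD_MS)
```

The point of the exercise: with every hop counted, only a few milliseconds of the 20 ms budget are left for the network itself, which is why edge computing close to the user matters so much.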

 

Over time, we'll see more computing done in the headset itself. Until then, offloading from the device requires 5G (and eventually 6G) networks with sufficient throughput, edge computing, and tight latency control. We need low latency, low jitter, high bandwidth, and an ultra-reliable transport network with minimal packet loss. We're getting there, but today's networks can't deliver all of that yet.

 

Offloading graphics processing and rendering

We need more powerful networks not only because shrinking devices increases edge computing requirements, but also because virtual worlds require a great deal of graphics processing and rendering. This rendering needs to be done at the edge, with the rendered world streamed back to the worn device in near real time.

 

Moving graphics processing and rendering to the edge not only opens the door for devices to become smaller and lighter; it also lays the groundwork for new innovations in complex rendering performed remotely and streamed back to the device. It's one thing to remotely render a relatively linear virtual world like a video game, but quite another to deliver a live, shared experience in real time.

 

Some devices have experimented with different models of offloading computing power. The Valve Index, for example, is a VR headset that connects to a high-powered computer via a wired connection, primarily for gaming.

 

Then there's a company called Nreal that offers AR glasses using a wired connection to harness the processing power of a smartphone. Although both of these examples rely on wires, they push us toward applications, devices, and virtual worlds that can be accessed, processed, and rendered over wireless networks.

 

There is also a technology called sidelink, being standardized in 3GPP, that allows certain cellular devices to communicate directly with each other without going through the core network. This has the potential to be very useful for VR and AR rendering. Innovations like these raise the possibility of glasses-like devices that could one day replace cell phones.

 

Interoperability is key

Will Facebook/Meta "own" the metaverse? They will have a virtual world, and they might call it a metaverse, but they won't own the whole metaverse, any more than any one company owns today's internet. As we see it, the metaverse will be a collection of virtual worlds, much like the internet is a collection of countless sites for every imaginable purpose. Some parts of the metaverse may be digital twins of the real world, some parts may blend the real and the virtual, and other parts may remain entirely virtual.

 

The metaverse will eventually become decentralized and device-independent. And, like the internet, it will need a set of standards, protocols, and common APIs to function properly and interoperate. Once that's in place, users will be able to access Facebook's metaverse over a 5G (or 6G) network using a smart device like a phone, just as you might access Google's virtual world through a Sony device over AT&T's network.
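What would a "common API" across providers buy us? A toy sketch in Python: every name below (MetaverseWorld, join, the provider classes) is hypothetical and invented for illustration, since no such standard exists today. The point is only that when rival providers implement one shared interface, a client written once can enter any of their worlds:

```python
# Toy sketch of a hypothetical common metaverse API.
# All class and method names here are invented for illustration;
# no such standard currently exists.
from abc import ABC, abstractmethod


class MetaverseWorld(ABC):
    """Interface any provider's world would implement under a shared standard."""

    @abstractmethod
    def join(self, user_id: str, device: str) -> str:
        """Admit a user from any compliant device; return a session token."""


class MetaWorld(MetaverseWorld):
    def join(self, user_id: str, device: str) -> str:
        return f"meta-session:{user_id}@{device}"


class GoogleWorld(MetaverseWorld):
    def join(self, user_id: str, device: str) -> str:
        return f"google-session:{user_id}@{device}"


# Because both providers implement the same interface, the same client
# code can enter either world from any device, over any network.
for world in (MetaWorld(), GoogleWorld()):
    print(world.join("alice", "sony-headset"))
```

This is the same pattern that makes the web interoperable: browsers and servers from different vendors agree on shared protocols, so no single vendor controls access.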

 

If devices and worlds remain largely proprietary, as they are today, growth potential will be limited. Interoperability standards will be as essential to the metaverse as MPEG is to video compression and 3GPP is to cellular communications. In such a metaverse, you could enter different areas regardless of the provider you use to access it, and each business would have its own brand-specific experience in the virtual world, just as they do in the real world.

 

To provide the highest quality experience to the greatest number of users, device and network interoperability is critical and must be standardized. Once such a standard is created, no one company owns it, just as no one company owns 3GPP or MPEG.

 

What would the metaverse look like?

So, once we get there, how will extended reality be used? We expect gaming to remain an important driver, as it is today. But there are many other ways we can see this technology taking shape.

 

What if we could design a virtual sports bar, or watch a race from the cockpit, the pit lane, or the stands? What if you could simulate diving with sharks, go skydiving, or visit a world-class museum? The possibilities of the metaverse seem limitless.

 

We're probably 15 to 20 years or more away from a truly standardized, open metaverse. Meanwhile, we'll see many companies experimenting with their own metaverses, such as Facebook's big-M Metaverse. But will Facebook/Meta's version be the metaverse? Of course not. Facebook may have a "branded" metaverse, but there will be many metaverses to explore and enjoy.

 
