Metaverse Musings

For Starters…

Recently, it seems every podcast we listen to and article we read is consumed by one topic – the Metaverse. Framed as shared digital universes that will usher in a new era of humanity, the Metaverse has become 2022’s biggest buzzword. But what exactly is the Metaverse? And how will it impact healthcare? Before we can answer those questions, we must ground ourselves in a few facts.

#1 Human perception of reality is dependent upon what we experience – what we touch, see, hear, taste, and smell. This perception goes on to inform our cognitive and affective states, intentions, and, ultimately, our behavior.

#2 Researchers at MIT suggest the brain can process an image in as little as 13 milliseconds. That makes imagery one of the fastest ways for humans to take in new information and communicate with one another.

#3 Roughly 90% of the world's data was created in the last two years alone. Unless we upgrade the way information is presented and consumed, the oceans of real-time data we’re generating will go largely unused, and human decision-making will lag far behind the data available to inform it.

What is the Metaverse?

The “Metaverse” is not a place. It nods to the realization that the internet and the technologies enabling it are evolving to become more immersive and more embedded in the world around us. We might call this a “Spatial Web,” to borrow from Gabriel René and Dan Mapes. Or an embodied web. Or simply ambient computing.

Digital transformation can be classified into four stages. First, the personal computer connected humans to an intelligent machine. The internet connected individuals with sources of knowledge. Mobile devices connected people with billions of other people. Now, in the fourth wave, an emerging web will connect our digital and physical worlds.

That will be enabled by a complex set of core technologies. The first is the interface layer: how users physically connect with and experience an immersive web, including Augmented Reality (AR), Virtual Reality (VR), Mixed Reality (MR), robotics, wearables, and the Internet of Things (IoT). The next, the logic layer, will leverage machine learning and artificial intelligence to make sense of all that data. And the data layer, which includes blockchain and edge computing, will allow all of this to happen securely and quickly. I’m borrowing terminology here from The Spatial Web by René and Mapes. A great read!
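To make that layering a little more concrete, here is a minimal, purely illustrative sketch of how data might flow from an interface-layer device, through a logic-layer model, to a data-layer store. Every name in it is a placeholder of my own, not a real Metaverse API or anything prescribed by René and Mapes.

    # Hypothetical sketch of the three-layer stack described above.
    # All class and function names are placeholders for illustration only.

    from dataclasses import dataclass


    @dataclass
    class SensorReading:
        """Raw data captured by the interface layer (headset, wearable, IoT device)."""
        device_id: str
        payload: dict


    def interface_layer() -> SensorReading:
        # Interface layer: devices capture what the user sees, touches, or does.
        return SensorReading(device_id="headset-01",
                             payload={"gaze": (0.2, 0.7), "heart_rate": 72})


    def logic_layer(reading: SensorReading) -> dict:
        # Logic layer: ML/AI interprets the raw stream into something actionable.
        # (A trivial rule stands in for a real model here.)
        return {"attention": "high" if reading.payload["gaze"][1] > 0.5 else "low"}


    def data_layer(insight: dict) -> None:
        # Data layer: persist the result (e.g., on an edge node or ledger)
        # so it is available securely and with low latency.
        print(f"storing insight: {insight}")


    if __name__ == "__main__":
        data_layer(logic_layer(interface_layer()))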

For end users, the most obvious (and hyped) piece of that tech stack is augmented and virtual reality.

  • Virtual reality (VR) immerses users in a fully artificial, digital environment, most commonly via VR headsets like the Oculus Quest 2. One of the most popular depictions of VR in action is the movie Ready Player One.

  • Augmented reality (AR) overlays virtual objects onto the real world to enhance a user’s perceived environment. The worldwide phenomenon Pokémon GO is one of the best-known examples, and furniture retailers like IKEA and Wayfair use AR to let shoppers preview furniture in their homes before they make a purchase.

  • A third category, mixed reality (MR), doesn’t just overlay virtual objects but anchors them in the real world, letting users interact with both the physical and virtual environments at once and blurring the line between what is real and what is not. One of the better-known MR devices is Microsoft's HoloLens 2, a head-mounted holographic computer that projects holograms users can manipulate and interact with as though they existed in their physical surroundings.

We’ll continue to see these distinctions blur as the technology converges toward mixed reality.

Looking Forward

In the last decade, consumers have seen significant growth in AR and VR applications, but many have launched without a real user-needs assessment. In the next five years, we’ll see innovation separate from hype. That should produce use cases for immersive experiences that focus on the technology’s true value adds:

  • Visualizing data spatially to speed up interpretation, improve comprehension, or build empathy.

  • Simulating environments to reduce risk or make an experience more accessible.

Companies that keep these parameters (and their end users) in mind will prevail.

These thoughts are my own and do not represent the viewpoints of any company or organization with which I’m affiliated.
