Zuckerberg is not the first visionary to try this. Remember Second Life? It is still active, and in its prime it was extremely popular. I visited Linden Lab’s San Francisco office back in 2007 and was spending a good amount of time on the platform every day. It has remained a pleasant virtual environment, similar to an online game.
The founder, Philip Rosedale, went on to found High Fidelity, with the intent to build an even better version that included VR capabilities. With this initiative, too, he seems to have been a visionary far ahead of his time. Today, the company has pivoted to providing spatial audio features; nothing more.
In 2010 another visionary, Rony Abovitz, launched a very mysterious start-up called Magic Leap. After a while they revealed what they were working on: a mixed reality headset that sees the real world and overlays a virtual world on top of it. They produced compelling videos demonstrating really cool use cases. However, they struggled with the development and deployment of the technology.
Google, now Alphabet, worked on the so-called Glass project. Microsoft has its HoloLens mixed reality device. Taiwanese HTC calls its VR headset Vive. Facebook got its hands in the game, too, acquiring Oculus in 2014 to bring VR technology into the Facebook world. Still, none of these devices has managed to become mainstream either.
But why isn’t virtual reality scaling?
One reason might be that carrying such a device with you is not very practical. We already have our mobile phones with us, which is why augmented reality has found its way into more use cases.
Another reason might be that VR usage causes motion sickness for some people. It definitely does for me. I try out different applications, games, and use cases every now and then, and most of the time, especially if there is some sort of movement at play, my body-brain coordination gets out of balance and I feel sick within just a few minutes.
So I can’t attend a one-hour training session or play a game for long. Why should I bother jumping into a virtual world if I get motion sick and can’t continue for more than a few minutes?
Virtual reality use cases must become richer.
This is not just about gaming or entertainment. From business to education, from healthcare to architecture, creative ways of using AR, VR, or MR are getting better and better. The real world is three-dimensional, but our screens are two-dimensional, so we are missing one dimension. Our brain is built to interpret a 2D image as 3D, but diving into an actual 3D reality is incredibly powerful.
Imagine you could experience a construction site virtually and remotely in perfect detail. Sounds good? Yes, unbelievably valuable. It saves travel costs, exposes mistakes or misunderstandings, gives all stakeholders a shared version of reality, and allows large teams to collaborate virtually and remotely. In architecture and product design, it makes perfect sense. 3D-related use cases are predestined to be enriched with augmented or virtual reality extensions.
The human body is 3D as well, so healthcare applications make a lot of sense. Exploring the anatomy of the body, viewing a CT scan in 3D, enriching a real surgery with augmented live data to support the surgeon, and similar use cases could revolutionise medical services. Many use cases have been envisioned, but the bloody reality is still the regular emergency room.
The same is true for education, though it would require all content to be upgraded to 3D. Today, most educational content is 2D, static, unanimated, and quite outdated. Turning it into 3D, animated, colourful, enriched, interactive, and even collaborative experiences would be a game changer. Conceptually it has already been envisioned; the infrastructure is not yet ready, and the content still needs to be upgraded…
How far can we stretch the digital twin?
A digital twin is a digital representation of a physical thing. GE is famous for its digital twins of wind turbines and jet engines.
To build one, the equipment needs to be 3D-modelled and fitted with sensors and connectivity; the sensor data must be captured online, fed into the digital twin model, and visualised on a screen in 2D. With virtual reality you could fly right up to the wind turbine, look at it in 3D, and understand what is going on as if you were there. Isn’t that powerful?
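To make that pipeline concrete, here is a minimal sketch in Python of what the software side of such a twin might look like: a stream of sensor readings updates an in-memory model, and its snapshot is what a 2D dashboard or a VR scene would then render. The names here (WindTurbineTwin, rotor_rpm, and so on) are illustrative assumptions, not any vendor’s actual implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sensor reading arriving from a turbine's telemetry feed.
@dataclass
class SensorReading:
    sensor_id: str
    metric: str        # e.g. "rotor_rpm", "blade_vibration_mm_s"
    value: float
    timestamp: datetime

# A minimal digital twin: it mirrors the live state of one physical asset
# and keeps a history of raw readings so a visualisation layer can replay it.
@dataclass
class WindTurbineTwin:
    asset_id: str
    state: dict = field(default_factory=dict)     # latest value per metric
    history: list = field(default_factory=list)   # raw readings, newest last

    def ingest(self, reading: SensorReading) -> None:
        """Update the twin with one captured sensor reading."""
        self.state[reading.metric] = reading.value
        self.history.append(reading)

    def snapshot(self) -> dict:
        """Return the current state, ready to be rendered in 2D or in VR."""
        return {"asset_id": self.asset_id, **self.state}

# Usage: stream readings into the twin, then hand the snapshot to a renderer.
twin = WindTurbineTwin(asset_id="turbine-042")
twin.ingest(SensorReading("s1", "rotor_rpm", 14.2,
                          datetime.now(timezone.utc)))
twin.ingest(SensorReading("s2", "blade_vibration_mm_s", 0.8,
                          datetime.now(timezone.utc)))
print(twin.snapshot())  # {'asset_id': 'turbine-042', 'rotor_rpm': 14.2, ...}
```

In a real deployment the snapshot would be pushed continuously to whatever does the rendering, whether that is a flat dashboard today or a 3D scene in a headset tomorrow; the twin itself does not care which.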
What else could we model and simulate like that? A whole building? A whole hospital? A whole factory? A whole warehouse? Shall we go one step further? A whole city? The whole power grid of a region, or of the entire nation? Would that be possible? Could we collect all the sensor data and have a computer crunch it in real time, visualise it in real time, and then even provide an augmented 3D version that is interactive, simulative, and collaborative?
Has the time finally come?
The metaverse promises to deliver all of that, but only in the future. How long we must work on this, nobody knows. In his Connect 2021 keynote, Mark Zuckerberg showed various scenarios of how 3D and virtual reality applications could enrich our lives. They all look great. Interestingly, none of them are new. Second Life, High Fidelity, Google Glass, Magic Leap, and many others promised similar experiences. Why did they not succeed?
Has the time now come? Will Meta be luckier than its predecessors? That depends on many factors: the hardware, the content, the use cases, the bandwidth, the ecosystem, the developers, the business model, and more. But the most important factor will be the human one. Will we adopt this technology? Time will tell; nobody can predict for sure.
What could go wrong in the metaverse?
Hollywood loves dystopian scenarios, and it has produced one for virtual reality as well. You might remember the movie Ready Player One. In a not-so-distant future, people suffer and escape their sad realities by diving into virtual worlds, where they play, pretend to have much better lives, and make new friends. You can imagine how the story evolves…
In a Black Mirror episode called Playtest, virtual reality sensors and experiences were taken to the extreme: the player was no longer able to distinguish between game and reality. Something like this already happens in real life. Back in 2010, a South Korean couple became so absorbed in playing an online game that they forgot to feed their baby.
All technology exists to serve the human condition and its progress. We must not forget this; it must be the overarching design principle for all scientific and technological advancement, for autonomous weapon systems as much as for virtual reality. So be cautious and conscious when you produce or consume technology.