I couldn’t be further from the belief that Mark Zuckerberg is wasting his money and time on the metaverse. It’s cool to hate on Meta, and maybe they have done plenty to justify the general dislike as a company. But you cannot deny the impressive work they are doing. I believe very strongly our future has a metaverse in it.
This article isn’t focused on the applications of a metaverse product so much as the VR tech that is making it a truly immersive experience.
In these videos, some of which are only a few seconds long, you’ll see why!
1. Photo-realistic avatars
When most people think of Mark Zuckerberg and metaverse avatars, they picture this:
But that’s just this generation’s avatar. Let’s look forward in time a decade to what Meta is really working towards:
I’m sure you’ll agree these photoreal avatars are leaps and bounds ahead of what Meta is currently showcasing.
You’d be forgiven for wondering why they are even bothering to release the current version. The current offering is laughable. But Meta is a business, and it must try to make money in the meantime.
You might be thinking, “that’s cool, but so what?”. I hear you. This is just one piece in the metaverse puzzle. Making avatars that don’t look goofy is a big hurdle to overcome. Next is being able to do that at scale.
2. Instant avatars, using your phone’s camera
One of the challenges to overcome is making the technology accessible. The previous video required many cameras and high-tech equipment to produce an exceptional result.
But what if you could create a VR avatar using a quick scan from your phone camera?
This is an avatar generated at speed that could give the uncanny valley a run for its money.
The quality of the avatar is fantastic, but that isn’t what’s most remarkable. It is the tangibility of the demonstration – using a simple smartphone camera – that makes this example impressive.
I can see even my own technophobic mother being able to work out how to scan her face by holding out her phone like she’s taking a selfie. It’s breakthroughs like this that make mainstream VR and metaverses feel like a real possibility.
3. Full-body avatars with wardrobe changes
By the way, these avatars have legs. Not only that, they have full 360° form and function. In fact, you can even go ahead and change your outfit at any time:
Let’s talk about applications here. Malls are closing down across America. Instead of going clothes shopping, why not step into the metaverse and try on some outfits? From the comfort of your living room, you can quickly scan through hundreds of perfectly tailored outfits, at any angle you desire.
One step closer to Ready Player One? It feels that way to me!
4. Virtual gloves that you can feel
Just because you are in a virtual world doesn’t mean you can’t touch and feel the things around you. With haptic feedback gloves, you can do just that:
The metaverse might feel like a hollow experience if you can’t physically interact with people or objects around you.
Haptic feedback gloves bring virtual interactions to life. When you can feel a handshake or a fist bump, it blurs the lines between virtual and physical worlds.
This one still feels very much in the research phase, but you get the sense that in a decade it will be the norm in the VR world.
5. Control everything with the flick of the wrist
Another barrier to mainstream adoption is the clunky controllers that effectively tether you to your desk or living room.
Using a device that is roughly the size of a large watch on your wrist, Mark demonstrates how you can harness motor neuron signals to navigate interfaces in the augmented world:
Best of all, the device learns about you and adapts to your unique movements. You don’t learn how to use the device, it learns how you want to use it.
The shrinking of devices, controllers, and headsets makes the VR and AR experience more palatable for a mainstream audience. The future might not look so goofy after all.
6. Mimic the motions of an entire body with only a few sensors
How do you realistically recreate our motions in a virtual world, when you only have a few devices tracking you?
With just a controller, headset and bracelet, the system is able to estimate what posture you’re most likely adopting, even if every limb and joint on your body isn’t being tracked.
The motion of your avatar in the virtual space adapts to other virtual objects or terrains it encounters. So if you’re walking over an uneven surface, your avatar will move accordingly, even if your surface in the real world is flat.
The motion estimation can also seemingly upscale or downscale to fit avatars of different heights or proportions. So, if you want to one day embody an avatar the size of the Iron Giant, or as small as Elmer Fudd, it might just be a reality someday!
7. Displays that pass the Turing test
The Turing test is a test of a machine’s ability to be indistinguishable from a human. Can we truly blur the lines between virtual worlds and real worlds, to the point that you can’t tell you’re in a virtual world?
This last video is a long one, but super interesting. It’s all about the eyes.
Meta is trying to tackle challenges such as letting your eyes focus on things both close to you and further away (varifocal technology and depth of field). How can they ensure text is legible, crisp, and comfortable to read for long periods?
Matching the human eye’s resolution of roughly 60 pixels per degree is immensely challenging. The latest prototypes are pretty astounding when it comes to clarity of display:
And that’s just a snapshot of the progress being made by Meta in their quest for metaverse dominance. More innovations are being showcased every week.
In a decade, it feels realistic that virtual reality headsets will be commonplace. Even if you’re not convinced we’ll all be living in the metaverse in the 2030s, you can hopefully see the potential of the technology.
Meta’s multi-billion-dollar investment in VR technology might be looked back on as incredibly foolish or wasteful. It might also yield the most incredible leap forward in technology since the birth of the internet.
I am at least hopeful, if not certain, the latter will be true.