
Billions of dollars spent on XR development

At Connect 2021 last week, Michael Abrash, chief scientist of Meta's Reality Labs, gave a high-level overview of some of the work behind the billions of dollars the company is spending to advance XR and metaverse development.
Abrash leads the Reality Labs research team, which is tasked with researching technologies the company believes could lay the foundation for XR and the metaverse in the coming decades. At Connect 2021, he shared some of the organization's latest work.
Meta's Codec Avatars project aims to build a system capable of capturing and representing lifelike avatars for use in XR. Beyond simply "scanning" a person's body, a major challenge is making the avatar move realistically, to say nothing of getting the entire system to run in real-time so the avatar can be used in an interactive environment.
The company has demonstrated its Codec Avatars work on various occasions, showing improvements each time. The project initially focused on high-quality head and face avatars but has since expanded to full-body avatars. Researcher Yaser Sheikh says there is now support for more complex eye movements, facial expressions, and hand and body gestures, including motions that involve self-contact.
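The "codec" in the name hints at the general shape of such a system: a stream of tracking signals from the headset is compressed into a compact code that is cheap to send over the network, then decoded back into a drivable avatar on the receiving side. The sketch below is purely illustrative and not Meta's implementation; the dimensions, the toy linear "networks", and the function names are all assumptions.

    # Illustrative encode/decode loop behind the "codec" idea (not Meta's code).
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stand-ins for learned encoder/decoder weights; in practice these
    # would be deep networks trained on capture-studio data.
    SENSOR_DIM, LATENT_DIM, AVATAR_DIM = 128, 16, 512
    W_enc = rng.standard_normal((LATENT_DIM, SENSOR_DIM)) * 0.1
    W_dec = rng.standard_normal((AVATAR_DIM, LATENT_DIM)) * 0.1

    def encode(sensor_frame: np.ndarray) -> np.ndarray:
        """Compress one frame of headset tracking signals into a small latent code."""
        return np.tanh(W_enc @ sensor_frame)

    def decode(latent: np.ndarray) -> np.ndarray:
        """Expand the latent code into avatar parameters on the receiving side."""
        return W_dec @ latent

    # Per-frame loop: only the small latent code crosses the network.
    sensor_frame = rng.standard_normal(SENSOR_DIM)
    latent = encode(sensor_frame)    # 16 floats instead of 128
    avatar_params = decode(latent)   # reconstructed for rendering remotely
    print(latent.shape, avatar_params.shape)

The appeal of this kind of split is that the heavy work of producing a lifelike avatar can live in the decoder, while only a small code has to move in real-time between participants.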
With the possibility of such realistic avatars in the future, Abrash acknowledges that it is important to consider the security of personal identity. To that end, he said, the company is "looking at how you can keep your avatar secure, whether by tying it to an authenticated account or other ways of verifying your identity."
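Abrash didn't go into technical detail, but one generic way to tie a digital asset to an authenticated account is to have the account service attach a cryptographic tag that other parties can verify before trusting the avatar. The snippet below is a minimal sketch of that idea, not Meta's design; the secret, the serialized asset, and the function names are assumptions.

    # Hypothetical sketch: bind an avatar asset to an account with an HMAC tag.
    import hashlib
    import hmac

    ACCOUNT_SECRET = b"per-account-secret-from-auth-service"  # placeholder value

    def sign_avatar(avatar_bytes: bytes) -> bytes:
        """Produce an identity/integrity tag for a serialized avatar asset."""
        return hmac.new(ACCOUNT_SECRET, avatar_bytes, hashlib.sha256).digest()

    def verify_avatar(avatar_bytes: bytes, tag: bytes) -> bool:
        """Accept the avatar only if its tag matches the account's secret."""
        expected = hmac.new(ACCOUNT_SECRET, avatar_bytes, hashlib.sha256).digest()
        return hmac.compare_digest(expected, tag)

    asset = b"...serialized avatar data..."
    tag = sign_avatar(asset)
    assert verify_avatar(asset, tag)            # genuine avatar is accepted
    assert not verify_avatar(b"impostor", tag)  # tampered avatar is rejected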
While Meta's Codec Avatars already look quite convincing, the team believes the ultimate goal of the technology is photorealistic fidelity. Abrash presented what he says is the research group's latest work on realistic rendering and lighting of hair and skin. Meta doesn't claim this runs in real-time (and we doubt it does), but it is the direction the team aims to take the Codec Avatars technology over time. Beyond high-quality physical representation, Meta expects clothing to remain an important way for people to express themselves in the metaverse, so the team believes making clothing look realistic will be an essential part of the experience.
While XR can easily transport us to other realities, it's also compelling to virtually teleport friends into your actual living space. Taken to the extreme, this means having a complete reconstruction of your real home and everything in it, running in real-time. That's exactly what Meta has built: a mock apartment in which everything has a perfect virtual replica.
Doing so allows the user to move around in real space and interact with it as usual while keeping the virtual version in sync.
So if you happen to have virtual guests visiting, they can watch you move through the real-world space and interact with whatever's inside in an incredibly natural way. Similarly, having a map of the space with this level of fidelity makes AR experiences and interactions more compelling when using AR glasses.
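Meta hasn't said how the replica is kept in sync, but conceptually it comes down to streaming the tracked poses of physical objects every frame and copying them onto their digital twins. The sketch below illustrates that loop with assumed object names and a made-up pose format; it is not Meta's actual pipeline.

    # Illustrative real-to-virtual sync loop (assumed names and formats).
    from dataclasses import dataclass

    @dataclass
    class Pose:
        position: tuple[float, float, float]
        rotation: tuple[float, float, float, float]  # quaternion (x, y, z, w)

    # Digital twins of tracked physical objects, keyed by a shared id.
    virtual_scene: dict[str, Pose] = {
        "coffee_table": Pose((0.0, 0.4, 1.2), (0.0, 0.0, 0.0, 1.0)),
        "mug":          Pose((0.1, 0.8, 1.1), (0.0, 0.0, 0.0, 1.0)),
    }

    def sync_frame(tracked_poses: dict[str, Pose]) -> None:
        """Copy the latest real-world poses onto the matching virtual objects."""
        for object_id, pose in tracked_poses.items():
            if object_id in virtual_scene:
                virtual_scene[object_id] = pose

    # One simulated tracking update: the user slides the mug across the table.
    sync_frame({"mug": Pose((0.3, 0.8, 1.1), (0.0, 0.0, 0.0, 1.0))})
    print(virtual_scene["mug"].position)  # -> (0.3, 0.8, 1.1)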
For now, this appears to be a "best-case" setup the company built so it could experiment within a real-world environment. If Meta finds that such a perfectly synchronized real and virtual space is important for valuable use cases, it may explore ways to let users easily capture their own space with similar precision.