At metaverse event, Meta's Zuckerberg unveils work to improve how humans chat to AI
(Reuters) - Facebook owner Meta (FB.O) is working on AI research to allow people to have more natural conversations with voice assistants, CEO Mark Zuckerberg said on Wednesday, a step toward shaping how people will communicate with AI in the metaverse.
The company's Project CAIRaoke is "a fully end-to-end neural model for building on-device assistants," said Zuckerberg, speaking at Meta's live-streamed artificial intelligence event.
Zuckerberg is betting that the metaverse, a futuristic idea of virtual environments where users can work, socialize and play, will be the successor to the mobile internet.
The social media company, which recently lost a third of its market value after a dismal earnings report, has invested heavily in its new focus on building the metaverse and changed its name to reflect this ambition. This month Meta reported a 2021 net loss of $10.2 billion from its Reality Labs, the company's augmented and virtual reality business.
Meta also recently announced its research team has built a new artificial intelligence supercomputer that it thinks will be the fastest in the world when completed in mid-2022.
Zuckerberg said Meta was working on a new class of generative AI models that will allow people to describe a world and generate aspects of it. He showcased an AI concept called Builder Bot, which lets users describe what they want the AI to generate. In a demo, Zuckerberg, appearing as a legless avatar on an island, used voice commands to have the AI create a beach and then add clouds, trees and even a picnic blanket.
"As we advance this technology further, you ll be able to create nuanced worlds to explore and share experiences with others, with just your voice," said Zuckerberg.
Meta is preparing for how AI could interpret and predict the types of interactions that would occur in the metaverse by working on "self-supervised learning," in which AI is given raw data rather than being trained on large amounts of labeled data.
Zuckerberg said Meta was also working on egocentric data, which involves seeing worlds from a first-person perspective. He said Meta had brought together a global consortium of 13 universities and labs to work on the largest-ever egocentric dataset, called Ego4D.
The company also said it was working to build a single AI system capable of translating between all written languages, and Zuckerberg announced that Meta was developing a universal speech translator aimed at providing instant speech-to-speech translation across all languages.