Mark Zuckerberg says a new material could support Metaverse ambitions
Facebook boss Mark Zuckerberg fences in the “Metaverse” with an Olympic gold medalist during a live-streamed virtual and augmented reality conference announcing the renaming of Facebook to Meta, in this screenshot taken from a video released October 28, 2021.
Facebook | via Reuters
Facebook co-founder Mark Zuckerberg, now Meta’s CEO, said Monday that a new touch sensor and plastic material could work together to potentially aid the development of what is known as a metaverse.
Together with scientists from Carnegie Mellon University, Artificial Intelligence researchers from Meta have created a malleable plastic skin less than 3 millimeters thick.
The relatively cheap material, known as ReSkin, contains embedded magnetic particles that create a magnetic field.
When the skin comes into contact with another surface, the magnetic field of the embedded particles changes. The sensor records the change in magnetic flux before relaying the data to AI software that tries to understand the force or touch being exerted.
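Meta has not published the details of its pipeline here, but the idea of mapping a flux change to a contact force can be sketched in a few lines. The sketch below is purely illustrative: it uses a one-dimensional flux signal, a simple least-squares linear fit, and a synthetic 100-touch calibration set (echoing the training described below); all names and the linear model are assumptions, not ReSkin's actual method.

```python
# Illustrative sketch: calibrate a force-from-flux model on labeled touches,
# then predict force for a new flux reading. Not Meta's actual pipeline.

def fit_force_model(flux_deltas, forces):
    """Ordinary least-squares fit of force = w * flux_delta + b."""
    n = len(flux_deltas)
    mean_x = sum(flux_deltas) / n
    mean_y = sum(forces) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(flux_deltas, forces))
    var = sum((x - mean_x) ** 2 for x in flux_deltas)
    w = cov / var
    b = mean_y - w * mean_x
    return w, b

def predict_force(model, flux_delta):
    """Estimate contact force (newtons) from a measured flux change."""
    w, b = model
    return w * flux_delta + b

# Synthetic calibration data: 100 labeled touches with a known
# (assumed) linear relationship between flux change and force.
calib_flux = [i * 0.01 for i in range(100)]
calib_force = [0.1 + 2.0 * x for x in calib_flux]

model = fit_force_model(calib_flux, calib_force)
print(round(predict_force(model, 0.5), 2))  # estimated force for a new reading
```

In practice a learned model like ReSkin's would take multi-axis magnetometer readings and a far richer nonlinear mapping, but the calibrate-then-predict structure is the same.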
“We developed a high-resolution touch sensor and worked with Carnegie Mellon to develop a thin robotic skin,” Zuckerberg wrote on Facebook on Monday. “This brings us one step closer to realistic virtual objects and physical interactions in the metaverse.”
The skin was tested on robots that handled soft fruits such as grapes and blueberries. It was also slipped over a rubber glove while a human hand shaped a bao bun.
The AI system had to be trained on 100 human touches to make sure it had enough data to understand how changes in magnetic field relate to touch.
The work is due to be published in a scientific journal later this month, but has yet to be reviewed by experts.
Touch has been largely neglected by AI researchers because touch sensors were too expensive or too fragile to gather reliable data, Abhinav Gupta, a researcher at Meta, said in a media call on Friday.
“When you think about how humans or babies learn, abundant multimodal data is very important in developing an understanding of the world,” said Gupta. “We learn from pixels, sounds, touch, taste, smell and so on.”
“But if you look at how AI has evolved over the past decade, we’ve made tremendous advances in pixels (computer vision) … and we’ve made advances in sound: audio, speech, and so on. But touch was missing from this development, even though it is very critical.”
If machines and robotic assistants can feel, they can better understand what humans are doing, Gupta said, adding that Meta’s ReSkin can detect forces as small as 0.1 newtons from objects less than 1 millimeter wide.
“For the first time we can try to better understand the physics behind objects,” said Gupta, adding that this will help Meta build the metaverse.
The metaverse is either the internet’s next evolution or the latest corporate buzzword to get investors excited about a nebulous innovation that may not even materialize in the next decade.
Either way, technology companies – primarily Meta – are increasingly promoting the concept of the metaverse, the term for a virtual world in which you can live, work and play. If you’ve seen the movie Ready Player One, you’ll have a pretty good idea of what the metaverse is: put on a headset and you’re transported to a digital universe where anything is possible.
If Meta’s metaverse ambitions come to fruition, it may be possible to interact with virtual objects and get some sort of physical response from a piece of hardware.
“When you wear a Meta headset, you also want it to have some tactile feel so that users can have an even richer experience,” said Gupta.
“How can you give haptic feedback if you don’t know what kind of touch a person feels or what material properties are present and so on?”
– Additional reporting by CNBC’s Steve Kovach.