The iCub 3 robot allows users to touch and feel items anywhere in the world. Now it aims to be ever present in the metaverse.
Have you ever wanted to visit a city you’ve never been to, without hopping on a plane? To feel the soft sand on a beach in Barcelona, or roam amid the iconic structures of Rome? This is the primary goal of the new iCub advanced telexistence system, also called the iCub3 avatar system, developed by researchers at the Italian Institute of Technology (IIT) in Genova, Italy.
IIT first tested the new system in a demonstration in November 2021: the robot was at the 17th International Architecture Exhibition’s Italian Pavilion, while the operator was 300 km away at an IIT lab in Genova. A standard fibre-optic connection linked the two.
The operator used a suite of wearable devices known as the iFeel system: multiple IMUs (inertial measurement units) placed at various points on a body suit; ‘gloves’ that both track the user’s finger movements and relay tactile sensations from the robot’s finger pads; and a VR headset that tracks the user’s facial expressions, eyelids and eye movements, picks up their voice, and lets them see what the robot is seeing and hear what it is hearing.
During the demonstration, the robot served as an avatar for the operator. It walked around the pavilion, had a conversation with a human tour guide, shook their hand and even hugged them; for the hug, haptic feedback units in the suit’s torso let the operator feel the embrace.
The researchers showed that the system conveys the operator’s locomotion, manipulation, voice and facial expressions to the robotic avatar, while returning visual, auditory and haptic feedback. This marks the first time such a complete system has been tested on a legged humanoid robot for remote tourism, allowing the human operator to feel and experience the place where the avatar is.
“We validated our iCub 3 avatar system on a legged humanoid robot, allowing for remote verbal, non-verbal and physical interaction,” says Daniele Pucci, head of the Artificial and Mechanical Intelligence Lab at IIT. “This represents a perfect starting point when looking for platforms to emulate humans for all interaction aspects.”
Pucci envisions the system having a future in Mark Zuckerberg’s recently launched virtual-reality social network. “What I also see in our near future is the application of this system to the so-called metaverse, which is actually based on immersive and remote human avatars,” he explains.