Researchers at Columbia University’s Creative Machines Lab have unveiled “Emo,” a humanoid robot that mimics facial expressions in real time. As detailed in a study published in Science Robotics, Emo anticipates and mirrors human expressions, even returning a smile about 839 milliseconds before the person completes one.
Traditional robots imitate human expressions only after a delay, which often feels artificial. Emo addresses this with a robotic head equipped with 26 actuators for precise, muscle-like movements and high-resolution cameras in its eyes that let it maintain eye contact. The actuators sit beneath blue silicone skin, giving Emo a lifelike appearance.
Two AI models power Emo: one predicts an oncoming human expression from subtle early cues in the face, and the other translates a target expression into commands for the actuators. Trained on videos of human facial expressions, Emo learned to replicate even subtle muscle movements within hours. The robot correctly anticipates human facial expressions about 70% of the time and can perform expressions such as raising its eyebrows and frowning.
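To make the two-model division of labor concrete, here is a minimal sketch of such a pipeline. Everything in it is an illustrative assumption, not the published architecture: the function names, the linear extrapolation standing in for the learned expression predictor, and the naive averaging standing in for the learned actuator model are all hypothetical.

```python
# Hypothetical sketch of a two-model mimicry pipeline (illustrative only,
# not Emo's actual published models).

def predict_expression(landmark_frames):
    """Predict the next facial-landmark vector by linear extrapolation
    over the two most recent frames (stand-in for the learned predictor)."""
    prev, curr = landmark_frames[-2], landmark_frames[-1]
    return [c + (c - p) for p, c in zip(prev, curr)]

def landmarks_to_actuators(landmarks, num_actuators=26):
    """Map a predicted landmark vector to commands for 26 actuators
    (stand-in for the learned control model)."""
    # Naive mapping: average groups of landmarks into one command per
    # actuator, clamped to a [0, 1] actuation range.
    group = max(1, len(landmarks) // num_actuators)
    commands = []
    for i in range(num_actuators):
        chunk = landmarks[i * group:(i + 1) * group] or [0.0]
        commands.append(min(1.0, max(0.0, sum(chunk) / len(chunk))))
    return commands

# Mouth-corner landmarks rising across two frames (the onset of a smile):
frames = [[0.10] * 52, [0.20] * 52]
predicted = predict_expression(frames)      # extrapolates the motion onward
commands = landmarks_to_actuators(predicted)
print(len(commands), round(commands[0], 2))
```

The point of the split is latency: because the first model predicts where the face is heading rather than waiting for the finished expression, the second model can drive the actuators early enough for the robot's expression to land roughly in sync with the human's.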
Currently, Emo interacts only through facial expressions, but researchers aim to integrate speech capabilities using large language models like ChatGPT. The team acknowledges that Emo’s lip movements need improvement. As robotic technology advances, future versions of Emo might feature full-body mimicry, potentially assisting humans in various tasks.