Android goes beyond Uncanny Valley with expressive FACE

It is natural to feel uncomfortable when we see humanoid robots trying hard, yet still failing, to look human. This phenomenon, commonly known as the uncanny valley, has inspired researchers to look for ways to move past it and give robotic faces lifelike expressions. A team of roboticists at the University of Pisa in Italy has created FACE, a humanoid robot that mimics realistic human facial expressions with considerable accuracy.

Aiming to cross the thin line that separates ‘realistic’ from ‘not realistic enough’, Nicole Lazzeri and her colleagues have developed a Hybrid Engine for Facial Expressions Synthesis (HEFES) that drives 32 motors around FACE’s skull and upper torso to deliver those realistic expressions. The combinations of motor movements are based on the Facial Action Coding System (FACS), which describes expressions in terms of the underlying facial muscle actions, and they shape the robot’s polymer facial skin accordingly.
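
To illustrate the general idea behind a FACS-driven engine such as HEFES, here is a minimal sketch of how named expressions could be translated into motor commands. The action-unit numbers, motor names and intensity values below are hypothetical examples for illustration only, not details of the Pisa team’s actual implementation.

```python
# Illustrative sketch: mapping FACS action units (AUs) to servo targets.
# All AU numbers, motor names and intensities here are hypothetical.

# Each expression is described as a set of action units with an
# intensity between 0.0 (relaxed) and 1.0 (fully contracted).
EXPRESSIONS = {
    "happiness": {"AU6": 0.8, "AU12": 1.0},            # cheek raiser, lip corner puller
    "sadness":   {"AU1": 0.7, "AU4": 0.5, "AU15": 0.8},
    "surprise":  {"AU1": 1.0, "AU2": 1.0, "AU26": 0.9},
}

# Hypothetical wiring of action units to the robot's servo motors.
AU_TO_MOTORS = {
    "AU1":  ["inner_brow_left", "inner_brow_right"],
    "AU2":  ["outer_brow_left", "outer_brow_right"],
    "AU4":  ["brow_lowerer"],
    "AU6":  ["cheek_left", "cheek_right"],
    "AU12": ["lip_corner_left", "lip_corner_right"],
    "AU15": ["lip_depressor_left", "lip_depressor_right"],
    "AU26": ["jaw_drop"],
}

def expression_to_motor_targets(name):
    """Convert a named expression into per-motor target positions (0.0-1.0)."""
    targets = {}
    for au, intensity in EXPRESSIONS[name].items():
        for motor in AU_TO_MOTORS[au]:
            # If several AUs drive the same motor, keep the strongest command.
            targets[motor] = max(targets.get(motor, 0.0), intensity)
    return targets

if __name__ == "__main__":
    print(expression_to_motor_targets("happiness"))
```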

To test how well HEFES works, the researchers asked five autistic and 15 non-autistic children to identify a set of expressions: anger, disgust, fear, happiness, sadness, and surprise. The expressions were performed first by FACE and then by a psychologist. While the children readily recognised happiness, anger and sadness, the robot’s renditions of fear, disgust and surprise proved harder to identify.

Via: NewScientist
