How eye contact shapes the believability of computer-generated faces
The direction a computer-generated character looks can determine whether human observers perceive its facial expressions as genuine emotional responses. Direct eye contact makes simulated smiles and angry glares look more authentic, while a downward gaze makes a digital face expressing sadness seem more real. These findings were published recently in Cognition and Emotion.

Digital characters frequently appear in online therapy programs, video games, customer service applications, and virtual companionship software. To succeed in these roles, virtual humans must build rapport with the users interacting with them. Doing so requires that they display emotional states human users interpret as authentic. Because virtual figures do not possess actual feelings, they rely entirely on visual cues to simulate a genuine state of mind.

Previous research has explored how physical features shape the way people interpret an emotional display. To determine whether a smile is a true reflection of happiness, a person will often look for the crinkling of the skin around the eyes. Observers commonly interpret these eye wrinkles as a sign …









