Conceptual knowledge predicts the representational structure of facial emotion perception

By: Jeffrey A. Brooks and Jonathan B. Freeman
Published in Social Sciences

The paper in Nature Human Behaviour is here: go.nature.com/2MhFYgr

Whether we’re looking into the eyes of our closest friends and loved ones, or glancing at a stranger on the street, we seem immediately and instinctively aware of what people are feeling. We see anger in scowls and glares, sadness in frowns, joy in laughter. Like many of the most interesting topics in psychology, this ability is completely transparent to us – we don’t have to think or try too hard to receive this information from mere muscular actions on the face.

 

But even though we often take it for granted, emotion perception is a fascinating and impressive ability – what is it about the human visual system, our minds, or the face itself that allows us to do this?

 

Emotion perception has been studied in psychology since laboratories for experimental psychology sprang into existence around 150 years ago. Some of the most classic and influential ideas about emotion perception hold that facial expressions of emotion are evolved signals for communicating our internal states to others. On this view, as we evolved to signal emotions with our faces, we also evolved the ability to recognize those signals in the faces of others. This conforms pretty well to our everyday experience of emotion perception, in which we seem able to read emotion from other people’s faces immediately.

 

However, a number of studies have also shown that when the experimental paradigm is changed slightly to give people more information than just the face – like a visual scene or body language – we are not as accurate at reading the signals supposedly available right on the face. Some of these studies showed that when the surrounding context – scene, body, or voice – delivers a different message than the face (like someone looking happy, but with a slumped body posture), we are more likely to rely on that surrounding information when attributing an emotion to the person (in this example, labeling them sad rather than happy, even though the face alone suggests happiness).

 

A provocative idea that has recently been influential in the field of emotion perception is that everyone carries their own individual “context” that can influence perception in much the same way. It is not only a scene, voice, or body posture that can shape our categorizations; our own preconceived ideas, memories, and knowledge about emotions may do so as well. So if you and I have different ideas about what “anger” really means, we might make different use of the information available from the face and environment when deciding whether someone is feeling angry.

 

We devised a set of studies to test this idea directly. We showed people faces displaying expressions of the six “basic” emotions – anger, disgust, fear, happiness, sadness, and surprise. These are the most commonly studied emotions and, based on prior research, the most likely candidates for emotions we evolved specific facial signals for. Participants categorized each face as quickly as they could as one of two emotion categories by clicking with a computer mouse. For example, they might see a traditionally “angry” scowling face and have to categorize it as “angry” or “disgusted” by clicking on one of those words on the screen.

 

We used a computer mouse-tracking method to record participants’ hand movements as they moved the cursor toward either response. This allowed us to gain insight into the process of emotion perception, including how a given interpretation evolved over hundreds of milliseconds. We also measured how conceptually similar people find each pair of emotion categories – for example, how similarly they think about “anger” and “disgust” in terms of how those emotions feel and what situations elicit them.
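
To make the trajectory measure concrete, here is a minimal sketch in Python (not the software we actually used) of one standard mouse-tracking index: the maximum deviation of a recorded trajectory from the straight line between its start point and the chosen response. The trajectory below is invented purely for illustration.

```python
import numpy as np

def max_deviation(trajectory):
    """Maximum perpendicular deviation of a mouse trajectory from the
    straight line joining its first and last samples; a standard index
    of attraction toward the unchosen response in mouse-tracking research.
    `trajectory` is an (n, 2) sequence of (x, y) cursor positions."""
    traj = np.asarray(trajectory, dtype=float)
    start, end = traj[0], traj[-1]
    dx, dy = end - start
    # Perpendicular distance of every sample from the start-to-end line.
    dist = np.abs(dx * (traj[:, 1] - start[1]) - dy * (traj[:, 0] - start[0]))
    return dist.max() / np.hypot(dx, dy)

# Hypothetical trial: the cursor starts at the bottom centre, swerves toward
# the competing label on the upper left, then settles on the chosen label at
# the upper right of the screen.
example = [(0.0, 0.0), (-0.2, 0.4), (-0.1, 0.7), (0.6, 0.9), (1.0, 1.0)]
print(round(max_deviation(example), 3))  # larger values = more attraction
```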

 

We found that when people thought two emotions were more conceptually similar, they were more likely to show a simultaneous attraction to both responses while making their categorizations – for example, swerving the mouse toward “disgusted” before settling on their final response of “angry”.
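
To illustrate the logic of this analysis (a rough sketch only, not the actual statistical models reported in the paper), the snippet below relates per-pair conceptual-similarity ratings to per-pair trajectory attraction. All of the numbers here are random placeholders standing in for real data.

```python
from itertools import combinations
import numpy as np
from scipy.stats import spearmanr

emotions = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]
pairs = list(combinations(emotions, 2))  # all 15 emotion pairs

# Placeholder values for one participant: a conceptual-similarity rating for
# each emotion pair, and that participant's mean trajectory attraction
# (e.g. maximum deviation) on trials pitting those two labels against each
# other. In a real analysis these would come from the rating task and the
# mouse-tracking trials, not from a random number generator.
rng = np.random.default_rng(seed=1)
similarity = rng.uniform(0.0, 1.0, size=len(pairs))
attraction = rng.uniform(0.0, 1.0, size=len(pairs))

# The prediction is a positive association: pairs rated as more conceptually
# similar should pull the mouse more strongly toward the unchosen label.
rho, p = spearmanr(similarity, attraction)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f}) across {len(pairs)} pairs")
```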

 

If we had only paid attention to people’s final responses, we would have found evidence consistent with classic ideas about emotion perception – that pretty much everyone is able to “correctly” identify these emotions from their traditionally understood facial expressions. Instead, by measuring the underlying process with mouse-tracking, we found that it is subtly influenced by people’s individual ideas about emotions – regardless of whether their final response was “correct”.

 

Moving forward, we hope this work can inspire more researchers to use new methods when they study emotion perception, to capture more information than a final button-press response can provide. In the scientific fields that study emotion, it is hard to disentangle debates about how we perceive emotion from debates about whether facial expressions “signal” emotion as classical theories suggest. While our research does not necessarily speak to the inherent evolutionary meaning of facial expressions, it does show that everyone’s unique ideas and experiences can influence the way we perceive emotions. We hope this work can help us reach a more detailed understanding of our fascinating ability to perceive emotion in others. If so, it might translate into a new understanding of why emotion perception is so difficult for some individuals, and of how the intricacies of this process shape our impressions of others and the ways we behave toward them.

