Why Facial Coding in Consumer Research is Flawed (And What We Should Be Doing Instead)
Alright, let’s talk about facial coding and why it isn’t the magic bullet for understanding consumer emotions that some people think it is. This topic came up at the recent SSP 2024 conference in Pittsburgh during the Emotions Session, following Holly Miller’s talk, which I covered in a previous blog post.
If you’re in the world of consumer research, you’ve probably heard of facial coding. The idea is simple: analyze people’s facial expressions to figure out what they’re feeling. A smile means happiness, a frown means sadness, and so on. Sounds great in theory, right? But here’s the problem—it’s just not that simple. And honestly, facial coding isn’t giving us the emotional insights we think it is.
Let me break it down.
Emotions Are Constructed, Not Hardwired
One of the leading voices pushing back on facial coding is Lisa Feldman Barrett, a neuroscientist whose work has seriously shaken up how we think about emotion. According to her Theory of Constructed Emotion, emotions aren’t hardwired, universal reactions we all share. Instead, our brains construct emotions on the fly, based on context, past experiences, and sensory input.
So, what does this mean for facial coding? Well, it means trying to understand someone’s emotional state just from their facial expressions is like trying to guess what movie someone’s watching based on their snacks—it might work occasionally, but mostly, it’s a shot in the dark. Emotions are way more complicated than “smile = happy” or “frown = sad.”
The Science Backs It Up
Barrett and her co-authors dig deep into this in the 2019 paper “Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements.” They argue that we need to stop assuming facial expressions give us a reliable read on someone’s emotional state. Sure, people might smile when they’re happy or scowl when they’re angry, but the evidence shows that’s not a consistent or universal rule.
Emotions, and how people express them, vary a ton: across cultures, across situations, and even from person to person. What one person shows as anger, another might not express in their face at all. And the same facial expression can mean totally different things depending on the context: a scowl could indicate anger, confusion, focus, or even a reaction to a bad smell. So if you’re relying on facial coding alone, you’re not getting the full picture.
Even Actors Don’t Follow Stereotypes
A 2021 study co-authored by Barrett, “Professional Actors Demonstrate Variability, Not Stereotypical Expressions,” makes this point even clearer. The researchers looked at professional actors (people trained to express emotions) and found that even they don’t rely on stereotypical facial expressions when portraying emotions. Instead, their facial movements vary based on the context of the scene they’re acting out.
And when participants rated the actors’ emotions, their interpretations varied too—especially when they didn’t know the context of the scene. So, even with trained professionals, facial expressions aren’t giving us reliable cues about emotion. The context really matters.
So, Why Are We Still Using Facial Coding?
It’s easy to see why facial coding caught on in consumer research. It seems simple and scientific, and it gives us clear results. But, as we’ve seen, the science behind it just doesn’t hold up.
Consumer behavior is complex, and so are emotions. If we really want to understand how people feel about a product or experience, we need to look beyond their faces. Emotions don’t exist in a vacuum—they’re shaped by context, expectations, and individual differences.
What Should We Be Doing Instead?
I’m not saying facial coding has no place in consumer research. It’s a tool, but it’s not the whole toolbox. A more effective approach combines it with methods that dig deeper into what’s driving consumer behavior. One of the best ways to do this? Pairing implicit association testing with behavioral science frameworks.
With implicit association tests, we can tap into automatic, unconscious associations people have with a product or experience—insights that go beyond what we can see on the surface. These tools can reveal underlying emotional connections and preferences that people might not even be aware of.
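To make that concrete, here’s a minimal sketch of how implicit association test responses are often summarized into a D-score, in the spirit of Greenwald and colleagues’ improved scoring algorithm. The latencies are hypothetical, and real scoring pipelines add more preprocessing, such as error penalties and separate handling of practice blocks:

```python
import statistics

def iat_d_score(compatible_ms, incompatible_ms):
    """Simplified IAT D-score from two blocks of response latencies (ms).

    compatible_ms:   trials where the product is paired with 'pleasant'
    incompatible_ms: trials where the product is paired with 'unpleasant'
    """
    # Drop implausibly slow trials (a common preprocessing step).
    compatible_ms = [t for t in compatible_ms if t < 10_000]
    incompatible_ms = [t for t in incompatible_ms if t < 10_000]

    # Standard deviation pooled across all trials in both blocks.
    pooled_sd = statistics.stdev(compatible_ms + incompatible_ms)

    # Positive D: faster when the product is paired with 'pleasant',
    # suggesting a stronger automatic positive association.
    return (statistics.mean(incompatible_ms)
            - statistics.mean(compatible_ms)) / pooled_sd

# Hypothetical latencies (ms) for one respondent:
print(f"D = {iat_d_score([620, 580, 710, 650], [840, 900, 780, 870]):.2f}")
```

The key idea is that we never ask the respondent how they feel; the speed of their categorizations does the talking.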
And if we start designing research with behavioral science frameworks, we can focus more on how context shapes emotion. By looking at how different situations, environments, or even social settings affect the way consumers react, we can get a much more accurate picture of what’s really going on. This approach helps us capture reactions that might not show up in facial expressions but are still crucial for understanding consumer decision-making.
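As a toy illustration of building context into the design, the sketch below compares hypothetical liking ratings for the same product tested in two different contexts and sizes the context effect with Cohen’s d. The contexts, panel sizes, and ratings are all invented for illustration:

```python
import math
import statistics

# Hypothetical 9-point liking ratings for the same snack prototype,
# collected from matched panels in two consumption contexts.
clt  = [6, 7, 5, 6, 7, 6, 5, 7, 6, 6]   # central location test
home = [8, 7, 8, 9, 7, 8, 8, 7, 9, 8]   # at-home use test

# Pooled standard deviation for two independent samples.
n1, n2 = len(clt), len(home)
s1, s2 = statistics.stdev(clt), statistics.stdev(home)
pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))

# Cohen's d: how big is the context effect relative to the noise?
d = (statistics.mean(home) - statistics.mean(clt)) / pooled_sd
print(f"mean (CLT) = {statistics.mean(clt):.1f}, "
      f"mean (home) = {statistics.mean(home):.1f}, "
      f"context effect d = {d:.2f}")
```

Seeing the same product score meaningfully differently across contexts is exactly the kind of signal a face-only measurement would miss.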
Moving Past Quick Fixes
It’s tempting to want a simple, one-size-fits-all answer to how someone feels, but emotions and consumer behavior just don’t work that way. Instead of relying solely on facial coding, we need methods that embrace the complexity of emotion. Tools like implicit association testing and context-driven behavioral approaches let us see the bigger picture and deliver insights that are actually useful for business decisions, especially when combined with traditional consumer research tools like MaxDiff or drivers-of-liking studies.
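For anyone less familiar with MaxDiff, here’s a minimal sketch of the count-based scoring idea it rests on, using made-up choice tasks; production analyses typically fit a multinomial logit or hierarchical Bayes model rather than raw counts:

```python
from collections import Counter

# Hypothetical MaxDiff tasks: each shows a subset of product claims and
# records which one the respondent picked as best and which as worst.
tasks = [
    {"shown": ["crunchy", "organic", "low sugar", "new recipe"],
     "best": "crunchy", "worst": "new recipe"},
    {"shown": ["organic", "low sugar", "resealable", "crunchy"],
     "best": "low sugar", "worst": "organic"},
    {"shown": ["resealable", "new recipe", "crunchy", "low sugar"],
     "best": "crunchy", "worst": "new recipe"},
]

best, worst, shown = Counter(), Counter(), Counter()
for task in tasks:
    best[task["best"]] += 1
    worst[task["worst"]] += 1
    shown.update(task["shown"])

# Simple count-based score: (times best - times worst) / times shown.
scores = {item: (best[item] - worst[item]) / shown[item] for item in shown}
for item, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{item:12s} {score:+.2f}")
```

Because respondents make forced trade-offs instead of rating everything highly, the resulting scores discriminate between claims far better than standalone rating scales.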
At the end of the day, we need to be asking better questions and using better tools. Facial coding can be part of the mix, but it’s time we move past oversimplified views of emotion and start understanding consumers in a more nuanced, context-specific way.
And that’s my take on why facial coding isn’t cutting it. If we want to get serious about understanding emotions and behavior, we need to dig deeper—into context, into unconscious biases, and into how people actually experience the world around them.
Let’s do better.