
Lily Kelemen, Winner of the 2024 NIMH Three-Minute Talks Competition

Transcript

Lily Kelemen

My name is Lily Kelemen, and I'll be telling you about what's in a face and when we know it's there. We get a lot of information when we look at a face. We understand things about who a person is, like their age, their race, and their gender, and we also get information about how they're feeling, like where they're looking or what expression they have. These are examples of stimuli that have been used in the past to study facial emotion expression, but when you look at these images, there's something a little bit off. To prove that point, let's pretend you're commuting to work: you've got your coffee in hand, the metro door is open, but everyone inside looks like this. Now you'd be pretty freaked out, and that's because when we look at people in everyday life, their faces aren't tightly cropped in an oval.

They're not always looking directly at us, and their expressions aren't always pulled in this animated way.

So instead, in this project I'm focusing on using more naturalistic stimuli. These are examples of three images from the Wild Faces Database, or WFD. These are images that we've gleaned from the internet, and they're meant to represent more naturalistic images. The faces aren't always oriented head-on to the camera, and the expressions aren't always held in a fixed, posed position. We ask participants to look at groups of three drawn from these thousand images and pick the odd one out. Now, there are lots of ways you could do this. For example, some might say that this one's the odd one out because of perceived gender, or they might say that this one's the odd one out because the two on the left appear to be happy, while the one on the right appears to be angry.
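One way to picture how those odd-one-out choices become usable data: each time a participant singles out one image, the other two in the triplet earn an implicit similarity vote. Here is a minimal sketch of that tally in Python; the function name and data layout are illustrative assumptions, not the study's actual pipeline:

```python
import itertools
import numpy as np

def similarity_from_triplets(n_images, triplets, odd_one_out):
    """Accumulate pairwise similarity votes from odd-one-out judgments.

    triplets: list of (i, j, k) image-index tuples shown together.
    odd_one_out: the index chosen as 'odd' for each triplet.
    The two images NOT chosen are implicitly judged similar, so their
    pair gets one vote; counts are normalized by how often each pair
    appeared together.
    """
    votes = np.zeros((n_images, n_images))
    appearances = np.zeros((n_images, n_images))
    for (i, j, k), odd in zip(triplets, odd_one_out):
        # Every pair in the triplet was seen together once.
        for a, b in itertools.combinations((i, j, k), 2):
            appearances[a, b] += 1
            appearances[b, a] += 1
        # The two images that were not picked get a similarity vote.
        a, b = [x for x in (i, j, k) if x != odd]
        votes[a, b] += 1
        votes[b, a] += 1
    # Fraction of co-appearances in which a pair was kept together.
    sim = np.divide(votes, appearances,
                    out=np.zeros_like(votes), where=appearances > 0)
    np.fill_diagonal(sim, 1.0)
    return sim
```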

Either way, you'd be right. We fed that behavioral data into an algorithm, and that gave us this multidimensional scaling plot, which put the images that were rated more similar closer together and the images that were more different farther apart. Many different patterns emerge when we look at this multidimensional scaling plot. For example, the images that were perceived as female are clustered here, and the ones that were perceived as male are clustered here. The ones that look happy are in this corner, while the ones that look angry are in this corner.
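Multidimensional scaling itself is a standard technique, so a short sketch may help; this one uses scikit-learn's MDS on a precomputed dissimilarity matrix. The stand-in similarity matrix below is fabricated only to keep the snippet runnable; in practice it would come from the odd-one-out tallies sketched above:

```python
import numpy as np
from sklearn.manifold import MDS

# Stand-in for the similarity matrix from the triplet task: a small
# random symmetric matrix with 1s on the diagonal.
rng = np.random.default_rng(0)
sim = rng.uniform(size=(20, 20))
sim = (sim + sim.T) / 2
np.fill_diagonal(sim, 1.0)

# More similar -> smaller distance. Metric MDS then finds 2-D
# coordinates that place similar images close and dissimilar ones far.
dissim = 1.0 - sim
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)  # one (x, y) point per image
```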

In the present study, I'll be asking participants to view these images during a magnetoencephalography scan, or MEG. There are 275 sensors hidden in that helmet, which measure the magnetic activity that arises from the electrical activity of firing neurons in the brain. The data resulting from an MEG scan might look something like this. Here I'm plotting the magnetic activity at one sensor over time. I'd expect something that's more constant across faces, like age, race, or gender, to spike before something that's more changeable, like expression. This research is important because it helps us understand how the brain processes emotion on human faces, and it can help lay the foundation for understanding disabilities that make it more difficult to read expression, like autism or intellectual disability. Thank you.
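To give a feel for the kind of trace described here, this toy plot shows one sensor's signal over time using simulated data. The evoked peak around 170 ms nods to the well-known face-sensitive M170 response, but every number below is illustrative, not a result from this study:

```python
import numpy as np
import matplotlib.pyplot as plt

# Simulate the magnetic field (in femtotesla) at a single MEG sensor
# for the first 500 ms after a face appears on screen.
times = np.linspace(0.0, 0.5, 500)  # seconds after image onset

# An illustrative evoked peak near 170 ms, plus sensor noise.
evoked = 80.0 * np.exp(-((times - 0.17) ** 2) / (2 * 0.02**2))
noise = np.random.default_rng(0).normal(0.0, 8.0, times.size)
signal = evoked + noise

plt.plot(times * 1000, signal)
plt.xlabel("Time after face onset (ms)")
plt.ylabel("Magnetic field (fT)")
plt.title("Simulated activity at one MEG sensor")
plt.show()
```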