
Machine Learning Study Sheds Light on Gaze Patterns in Adults With Autism

Research Highlight

Neuroscience researchers often look to our gaze patterns—what we tend to spend time looking at—to gain insight into the human brain and behavior. Many studies have indicated that people with autism spectrum disorder (ASD) and people without ASD tend to have noticeably different gaze patterns when looking at social stimuli, which may reflect meaningful differences in social processing. However, studies have also shown wide variation in gaze patterns among people in the same group and even within individual people, raising the question of whether gaze patterns reflect a reliable trait.

A recent study conducted by researchers at the National Institute of Mental Health (NIMH) shows that although the relative amount of time people spend looking at different facial features varies among people, individual-level gaze patterns are consistent for both people with ASD and people without ASD. The findings, based on machine-learning analyses of eye-tracking data, also suggest that accurately estimating individual gaze patterns for people with ASD may require more data than previously thought.

For this study, researchers in the Section on Cognitive Neuropsychology in the NIMH Intramural Research Program examined data from 33 adult males who met specific criteria for the category of “broad autism spectrum disorders.” For comparison purposes, each participant with ASD was matched with another male participant who had a similar age and IQ but who did not have an ASD diagnosis.

Participants watched an 8-minute series of 22 movie clips depicting social interactions with two or more characters engaged in conversation. The researchers used eye-tracking technology to record participants’ eye movements as they watched the clips.

For their analyses, the researchers trained a machine learning algorithm to quickly classify each pixel in each frame of each movie clip, labeling the pixels as belonging to a specific body part (e.g., eye, nose, mouth) or to the background. The researchers linked these labels with participants’ eye-tracking data to identify what each participant was looking at in each frame of each clip.
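The article does not publish the analysis code, but the described pipeline — per-frame pixel labels from a segmentation model, joined with gaze coordinates to tally what each participant looked at — can be sketched as follows. All names, label codes, and data shapes here are hypothetical, chosen only to illustrate the lookup-and-tally step:

```python
import numpy as np

# Hypothetical sketch of the labeling-plus-lookup step described above.
# `frame_labels` holds one integer label map per movie frame (as a trained
# segmentation model might produce); `gaze` holds (frame, x, y) eye-tracking
# samples. The feature codes below are illustrative, not the study's own.
FEATURES = {0: "background", 1: "eye", 2: "nose", 3: "mouth"}

def gaze_feature_proportions(frame_labels, gaze):
    """Return the proportion of gaze samples landing on each feature."""
    counts = np.zeros(len(FEATURES))
    for frame_idx, x, y in gaze:
        label = frame_labels[frame_idx][y, x]  # label of the fixated pixel
        counts[label] += 1
    return counts / counts.sum()

# Toy example: a single 2x2 frame and four gaze samples.
labels = [np.array([[0, 1],
                    [2, 3]])]
samples = [(0, 0, 0), (0, 1, 0), (0, 1, 1), (0, 1, 1)]
props = gaze_feature_proportions(labels, samples)
# props: background 0.25, eye 0.25, nose 0.0, mouth 0.5
```

Per-participant vectors like `props`, computed per clip, are the kind of summary the consistency analyses described below would operate on.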

The researchers then investigated the consistency of gaze patterns at both the individual level and the group level.

At the individual level, the researchers found that participants with ASD showed less consistency in what they looked at across movie clips compared with participants who did not have ASD. That is, the proportion of time that a participant with ASD spent looking at specific facial features was less consistent from clip to clip. This result aligns with previous findings showing considerable variability in gaze patterns among people with ASD.

However, as the researchers included more data in their analyses, stable gaze patterns emerged for participants with ASD and their peers without ASD—on average, each participant showed specific gaze patterns, favoring particular facial features over others. This suggests that these gaze patterns reflect a reliable individual trait for participants with ASD and those without ASD.
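One simple way to make this idea concrete — an illustrative sketch, not the study's published method — is to score a participant's consistency as the mean pairwise correlation between per-clip gaze-proportion vectors, and to estimate the underlying "trait" pattern by averaging over the available clips:

```python
import numpy as np

# Illustrative sketch, assuming per-clip gaze-proportion vectors like those
# above. Consistency = mean pairwise correlation across clips; the trait
# estimate is simply the average pattern, which stabilizes as more clips
# (more data) are included.
def gaze_consistency(clip_props):
    """clip_props: (n_clips, n_features) gaze proportions per clip."""
    r = np.corrcoef(clip_props)                    # clip-by-clip correlations
    return r[np.triu_indices_from(r, k=1)].mean()  # mean off-diagonal value

def trait_estimate(clip_props):
    """Average gaze distribution over however many clips are available."""
    return clip_props.mean(axis=0)

# Toy data: three clips with similar but not identical gaze proportions.
clips = np.array([[0.10, 0.40, 0.20, 0.30],
                  [0.15, 0.35, 0.20, 0.30],
                  [0.12, 0.38, 0.22, 0.28]])
c = gaze_consistency(clips)   # near 1.0: a stable individual gaze pattern
```

Under this framing, the study's observation amounts to saying that single-clip vectors are noisier for participants with ASD, so more clips must be averaged before the trait estimate settles.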

Although gaze patterns varied noticeably among individuals within each group, the researchers also observed robust differences between the groups. In general, participants with ASD spent less time looking at faces, particularly the center of the face, compared with participants without ASD. Participants with ASD also spent relatively more time looking at the background areas of the clips.

It is important to note that additional studies will be needed to determine whether the findings generalize beyond this small sample of adult males to larger and more diverse groups of participants. In particular, studies with children will be essential for understanding how gaze patterns emerge and change across development.

Taken together, the findings add nuance to our understanding of individual gaze patterns in people with and without ASD. The researchers note that data-driven machine learning approaches offer considerable promise for advancing research in this area, lending the power and efficiency needed to analyze large eye-tracking data sets.

Reference

Reimann, G. E., Walsh, C., Csumitta, K. D., McClure, P., Pereira, F., Martin, A., & Ramot, M. (2021). Gauging facial feature viewing as a stable individual trait in autism spectrum disorder. Autism Research, 14, 1670–1683. https://doi.org/10.1002/aur.2540

Clinical Trial

NCT01031407