VLPFC Neurons Help Process Facial and Speech Information


A brain region associated with multisensory integration and working memory may also be significantly involved in the processing of social cues, according to recent research.

Prior studies have established that neurons in the ventrolateral prefrontal cortex (VLPFC) integrate facial and vocal stimuli. This new study demonstrates that VLPFC neurons also encode the identity of the “speaker,” in addition to that individual’s facial expressions and vocalizations.

“We still don’t fully understand how facial and vocal information is combined and what information is processed by different brain regions. However, these findings confirm VLPFC as a critical node in the social communication network that processes facial expressions, vocalizations, and social cues,” said senior author Lizabeth Romanski, associate professor of neuroscience at the Del Monte Institute for Neuroscience at the University of Rochester.

Testing Identity and Expression Perception

The ventrolateral prefrontal cortex is a brain region that is enlarged in primates, including humans and macaques. In this study, the Romanski Lab showed rhesus macaques brief videos of other macaques producing friendly, hostile, or neutral vocalizations and expressions.

The researchers recorded the activity of more than 400 neurons in the VLPFC and found that individual cells did not show strong categorical responses to the expressions or identities of the macaques in the videos.

But when the researchers combined the neurons into a pseudopopulation, a machine learning model could decode both the expression and the identity in the videos from the patterns of neural activity alone. This suggests that these features were encoded at the population level rather than by single neurons.
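The general idea of pseudopopulation decoding can be sketched as follows. This is a minimal illustration on synthetic data, not the authors’ actual pipeline: trial-by-trial firing-rate vectors are stacked across neurons, and a linear classifier is cross-validated on the stimulus labels. The neuron counts, noise levels, and classifier choice here are illustrative assumptions.

```python
# Hypothetical sketch of pseudopopulation decoding (synthetic data,
# not the study's code). Each trial is a vector of firing rates across
# neurons; a linear classifier is trained to decode the stimulus label.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_neurons, n_trials_per_id, n_identities = 50, 40, 3

# Synthetic rates: each "identity" shifts the mean rate of each neuron.
tuning = rng.normal(0.0, 1.0, size=(n_identities, n_neurons))
X = np.vstack([
    tuning[i] + rng.normal(0.0, 1.5, size=(n_trials_per_id, n_neurons))
    for i in range(n_identities)
])
y = np.repeat(np.arange(n_identities), n_trials_per_id)

# Cross-validated decoding of identity from the pseudopopulation.
clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"decoding accuracy: {acc:.2f} (chance = {1 / n_identities:.2f})")
```

Because the signal is distributed across many weakly tuned neurons, population accuracy rises well above chance even when no single neuron decodes the label reliably, which mirrors the qualitative result reported in the study.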

Key Social Cue Region

Overall, the identity of the macaque in the video largely dictated the activity of the population of VLPFC neurons. These findings suggest that the VLPFC is a key brain region in the processing of social cues.

“We used dynamic, information-rich stimuli in our study and the responses we saw from single neurons were very complex. Initially, it was difficult to make sense of the data. It wasn’t until we studied how population activity correlated with the social information in our stimuli that we found a coherent structure. For us, it was like finally seeing a forest instead of a muddle of trees,” said Keshov Sharma, lead author of the study. Sharma and Romanski hope their approach will encourage others to analyze population-level activity when studying how faces and voices are integrated in the brain.

The Romanski Lab is focused on understanding how the prefrontal cortex processes auditory and visual information. This processing is important for recognizing objects by sight and sound, as well as for effective communication.

Speech and Communication Disorders

The Romanski Lab previously identified the VLPFC as a brain region involved in retaining and integrating face and speech information during working memory. This collection of studies suggests that this brain region is important within the wider network that underpins social communication.

“Knowing what features populations of neurons extract from face and vocal stimuli and how these features are typically integrated will help us to understand what may be altered in speech and communication disorders, including autism spectrum disorders, where multiple sensory stimuli may not combine optimally,” Romanski said.

The Schmitt Program for Integrative Neuroscience from the Del Monte Institute for Neuroscience, the University of Rochester Medical Scientist Training Program, and the National Institutes of Health provided funding for the study.

Abstract

The ventrolateral prefrontal cortex (VLPFC) shows robust activation during the perception of faces and voices. However, little is known about what categorical features of social stimuli drive neural activity in this region. Since perception of identity and expression are critical social functions, we examined whether neural responses to naturalistic stimuli were driven by these two categorical features in the prefrontal cortex. We recorded single neurons in the VLPFC while two male rhesus macaques (Macaca mulatta) viewed short audiovisual videos of unfamiliar conspecifics making expressions of aggressive, affiliative, and neutral valence. Of the 285 neurons responsive to the audiovisual stimuli, 111 neurons had a main effect (two-way ANOVA) of identity, expression, or their interaction in their stimulus-related firing rates; however, decoding of expression and identity using single-unit firing rates rendered poor accuracy. Interestingly, when decoding from pseudopopulations of recorded neurons, the accuracy for both expression and identity increased with population size, suggesting that the population transmitted information relevant to both variables. Principal components analysis of mean population activity across time revealed that population responses to the same identity followed similar trajectories in the response space, facilitating segregation from other identities. Our results suggest that identity is a critical feature of social stimuli that dictates the structure of population activity in the VLPFC during the perception of vocalizations and their corresponding facial expressions. These findings enhance our understanding of the role of the VLPFC in social behavior.
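The trajectory analysis described in the abstract can be sketched roughly as follows. This is a synthetic illustration under assumed data shapes, not the study’s code: mean population activity over time is projected onto principal components, and each identity’s time course becomes a trajectory in the reduced space.

```python
# Rough sketch of the PCA trajectory analysis (synthetic data, not the
# study's code). Mean population activity (identity x time x neuron) is
# projected onto principal components; trajectories for the same identity
# should cluster together in the low-dimensional space.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_neurons, n_timebins, n_identities = 60, 30, 3

# Synthetic mean responses with an identity-specific offset per neuron.
offsets = rng.normal(0.0, 2.0, size=(n_identities, 1, n_neurons))
activity = offsets + rng.normal(0.0, 0.5, size=(n_identities, n_timebins, n_neurons))

# Fit PCA on all time points pooled across identities, then project each
# identity's time course to obtain its trajectory in PC space.
pca = PCA(n_components=3)
pca.fit(activity.reshape(-1, n_neurons))
trajectories = [pca.transform(activity[i]) for i in range(n_identities)]
print("trajectory shape per identity:", trajectories[0].shape)
```

In this toy version the identity-specific offsets dominate the variance, so each identity’s trajectory occupies its own region of the PC space, analogous to the segregation by identity reported in the abstract.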

Reference:
  1. Sharma KK, Diltz MA, Lincoln T, Albuquerque ER, Romanski LM. Neuronal Population Encoding of Identity in Primate Prefrontal Cortex. Journal of Neuroscience, 14 November 2023; JN-RM-0703-23. DOI: 10.1523/JNEUROSCI.0703-23.2023