Social Neural Signaling Suppressed During Zoom Conversations


When Yale neuroscientist Joy Hirsch used sophisticated imaging technology to watch the brain activity of two people conversing in real time, she uncovered an intricate choreography of neural activity in the regions of the brain that govern social interaction. When she ran comparable experiments with two people communicating over Zoom, the ubiquitous video conferencing platform, she found a very different neural landscape.

Neural signaling during online exchanges was substantially suppressed compared to activity observed in those having face-to-face conversations, researchers found.

“In this study we find that the social systems of the human brain are more active during real live in-person encounters than on Zoom. Zoom appears to be an impoverished social communication system relative to in-person conditions,” said Hirsch, the Elizabeth Mears and House Jameson Professor of Psychiatry, professor of comparative medicine and neuroscience, and senior author of the study.

Interactive Face Encoding

All human societies are built on social interaction, and, the researchers note, our brains are finely tuned to process the dynamic facial cues that serve as a primary source of social information during in-person encounters.

While most earlier studies that used imaging to monitor brain activity during such interactions focused on single subjects, Hirsch’s lab developed a suite of neuroimaging technologies that lets researchers observe, in real time, interactions between two people in natural settings.

Hirsch’s team recorded neural responses in people holding two-person conversations on Zoom, which millions of Americans use every day, and compared them with responses during in-person conversations.

Better Facial Processing

They found that the strength of neural signaling was dramatically reduced on Zoom relative to in-person conversations. The increased activity during face-to-face talks was associated with longer gaze times and larger pupil diameters, indicating greater arousal in both brains. Increased EEG activity during in-person encounters, the researchers said, reflected enhanced face-processing capacity.

Furthermore, the researchers discovered more coordinated neural activity between the brains of individuals conversing in person, implying an increase in reciprocal exchanges of social cues between the interacting partners.

“Overall, the dynamic and natural social interactions that occur spontaneously during in-person interactions appear to be less apparent or absent during Zoom encounters. This is a really robust effect,” Hirsch said. The findings, she added, demonstrate how important live, face-to-face interactions are to our natural social behaviors.

“Online representations of faces, at least with current technology, do not have the same ‘privileged access’ to social neural circuitry in the brain that is typical of the real thing,” she said.

Abstract

It has long been understood that the ventral visual stream of the human brain processes features of simulated human faces. Recently, specificity for real and interactive faces has been reported in the lateral and dorsal visual streams, raising new questions about the neural coding of interactive faces and about lateral and dorsal face-processing mechanisms. We compare neural activity during two live face-to-face conditions in which facial features and tasks remain constant while the social context (in-person versus online) is varied. Current models of face processing do not predict differences between these two conditions, as features do not vary. However, behavioral eye-tracking measures showed longer visual dwell times on the real face, as well as increased arousal, indicated by larger pupil diameters, in the real-face condition. Consistent with the behavioral findings, signal increases measured with functional near-infrared spectroscopy (fNIRS) were observed in dorsal-parietal regions for real faces, and increased cross-brain synchrony was also found within these dorsal-parietal regions for the real, in-person face condition. Simultaneously acquired electroencephalography (EEG) also showed increased theta power in the real conditions. These neural and behavioral differences highlight the importance of natural, in-person paradigms and social context for understanding live and interactive face processing in humans.
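To make the abstract’s EEG measures concrete, here is a minimal, hypothetical sketch of how theta-band power and cross-brain synchrony can be computed from two time series. This is not the authors’ analysis pipeline: the signals are simulated, theta power is estimated with a Welch power spectral density, and “synchrony” is stood in for by the Pearson correlation of theta band-passed signals.

```python
import numpy as np
from scipy.signal import welch, butter, filtfilt

rng = np.random.default_rng(0)
fs = 256                      # assumed EEG sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)  # 30 s of simulated data

# Simulated EEG from two conversation partners: independent noise plus
# a shared 6 Hz (theta-band) component to mimic coupled activity.
shared = np.sin(2 * np.pi * 6 * t)
eeg_a = shared + rng.normal(0, 1, t.size)
eeg_b = 0.8 * shared + rng.normal(0, 1, t.size)

def theta_power(x, fs):
    """Mean power spectral density in the 4-8 Hz theta band (Welch)."""
    f, pxx = welch(x, fs=fs, nperseg=fs * 2)
    band = (f >= 4) & (f <= 8)
    return pxx[band].mean()

def band_synchrony(x, y, fs, low=4.0, high=8.0):
    """Pearson correlation of two signals after theta band-pass filtering."""
    b, a = butter(4, [low, high], btype="bandpass", fs=fs)
    xf, yf = filtfilt(b, a, x), filtfilt(b, a, y)
    return np.corrcoef(xf, yf)[0, 1]

print(f"theta power A: {theta_power(eeg_a, fs):.3f}")
print(f"theta power B: {theta_power(eeg_b, fs):.3f}")
print(f"theta-band synchrony: {band_synchrony(eeg_a, eeg_b, fs):.2f}")
```

Because the two simulated partners share a theta-band component, the band-limited correlation comes out high; with purely independent noise it would hover near zero, which is the contrast the study’s synchrony measure is designed to capture.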

Reference:
  1. Nan Zhao, Xian Zhang, J. Adam Noah, Mark Tiede, Joy Hirsch. Separable Processes for Live “In-Person” and Live “Zoom-like” Faces. Imaging Neuroscience 2023; doi: 10.1162/imag_a_00027