Retinotopic Coding Transfers Information from Perception to Memory


Humans have vivid memories: we can recall the layout of our house, the color of our bedroom, or the front of our favorite restaurant. Neuroscientists have long been perplexed by how the brain encodes this information.

In a recent Dartmouth-led study, researchers uncovered a neural coding process that allows information to be transferred from perceptual regions to memory regions of the brain.

“We found that memory-related brain areas encode the world like a ‘photographic negative’ in space. And that ‘negative’ is part of the mechanics that move information in and out of memory, and between perceptual and memory systems,”

said co-lead author Adam Steel, a postdoctoral researcher in the Department of Psychological and Brain Sciences and fellow in the Neukom Institute for Computational Science at Dartmouth.

A Common Push-Pull Code

Memory areas contain smaller population receptive fields (pRFs) than their paired perceptual areas. Left: group-average pRF size with memory areas and perception areas overlaid; nodes are thresholded at R² > 0.08. Right: bars represent the mean pRF size for +pRFs in the scene perception areas (OPA, PPA) and +/−pRFs in the place memory areas (LPMA, VPMA), with individual data points shown for each participant. Across both surfaces, pRFs were significantly smaller on average in the PMAs than in their perceptual counterparts. Credit: Nature Neuroscience (2024). DOI: 10.1038/s41593-023-01512-3

Prior to this work, the standard theory of brain organization held that perceptual parts of the brain represent the world “as it is,” with the visual cortex representing the external world according to how light falls on the retina, that is, “retinotopically.”

In contrast, it was previously assumed that the brain’s memory centers represent information in an abstract way, devoid of physical features. However, the co-authors argue that this theory ignores the possibility that as knowledge is encoded or recalled, various brain regions may share a common code.

In a series of experiments, participants completed perception and memory tasks while their brain activity was recorded with functional magnetic resonance imaging (fMRI). The team identified an opposing, push-pull-like coding mechanism that governs the interaction between perceptual and memory areas of the brain.

The findings revealed that when light strikes the retina, visual parts of the brain increase their activity to reflect the pattern of light. Memory parts of the brain respond to visual stimuli as well, although their neural activity diminishes when processing the same visual pattern as visual areas.
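The opposing responses described above can be illustrated with a toy population-receptive-field (pRF) model. In standard pRF analysis, a unit's response to a stimulus is modeled as a Gaussian over visual-field position; the study's key observation is that memory-area pRFs carry a negative amplitude. The sketch below is purely illustrative and is not the authors' analysis pipeline; the specific centers, sizes, and amplitudes are assumed for demonstration.

```python
import math

def prf_response(stim_x, stim_y, center_x, center_y, size, amplitude):
    """Gaussian pRF: response to a point stimulus at (stim_x, stim_y).

    amplitude > 0 models a classic (positive) pRF in perceptual cortex;
    amplitude < 0 models an inverted (negative) pRF in a memory area.
    All parameter values here are illustrative, not fitted values
    from the study.
    """
    d2 = (stim_x - center_x) ** 2 + (stim_y - center_y) ** 2
    return amplitude * math.exp(-d2 / (2 * size ** 2))

# A paired perceptual/memory population tuned to the same visual-field spot.
# The memory pRF is drawn smaller, echoing the study's size finding.
perceptual = dict(center_x=3.0, center_y=1.0, size=2.0, amplitude=1.0)   # +pRF
memory     = dict(center_x=3.0, center_y=1.0, size=1.5, amplitude=-1.0)  # -pRF

# A stimulus inside the shared receptive field drives the perceptual unit
# above baseline and pushes the memory unit below it -- the
# "photographic negative" relationship.
stim = (3.0, 1.0)
print(prf_response(*stim, **perceptual))  # positive response
print(prf_response(*stim, **memory))      # negative response
```

During recall, the study reports that this relationship inverts, which in this toy picture corresponds to flipping the sign of the drive between the two populations.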

Three Key Findings

The study’s co-authors reveal three unexpected findings. The first is their discovery that memory systems retain a visual coding basis.

The second is that this visual code is upside-down in memory systems.

“When you see something in your visual field, neurons in the visual cortex are driving while those in the memory system are quieted,”

said senior author Caroline Robertson, an assistant professor of psychological and brain sciences at Dartmouth.

Third, this relationship flips during memory recall.

“If you close your eyes and remember that visual stimulus in the same space, you’ll flip the relationship: your memory system will be driving, suppressing the neurons in perceptual regions,”

Robertson explained.

“Our results provide a clear example of how shared visual information is used by memory systems to bring recalled memories in and out of focus,”

co-lead author Ed Silson, a lecturer of human cognitive neuroscience at the University of Edinburgh, said.

Moving forward, the team intends to investigate how this push-pull dynamic between perception and memory may contribute to challenges in clinical conditions such as Alzheimer’s disease.

Abstract

Conventional views of brain organization suggest that regions at the top of the cortical hierarchy process internally oriented information using an abstract amodal neural code. Despite this, recent reports have described the presence of retinotopic coding at the cortical apex, including the default mode network. What is the functional role of retinotopic coding atop the cortical hierarchy? Here we report that retinotopic coding structures interactions between internally oriented (mnemonic) and externally oriented (perceptual) brain areas. Using functional magnetic resonance imaging, we observed robust inverted (negative) retinotopic coding in category-selective memory areas at the cortical apex, which is functionally linked to the classic (positive) retinotopic coding in category-selective perceptual areas in high-level visual cortex. These functionally linked retinotopic populations in mnemonic and perceptual areas exhibit spatially specific opponent responses during both bottom-up perception and top-down recall, suggesting that these areas are interlocked in a mutually inhibitory dynamic. These results show that retinotopic coding structures interactions between perceptual and mnemonic neural systems, providing a scaffold for their dynamic interaction.

Reference:
  1. Steel, A., Silson, E. H., Garcia, B. D., et al. A retinotopic code structures the interaction between perception and memory systems. Nat. Neurosci. (2024). DOI: 10.1038/s41593-023-01512-3