The Lateral Intraparietal Area: Spotting A Friendly Face In The Crowd

If a friend is looking for you in a crowd, think of the clues you might offer: “I’m wearing a red shirt,” or “I’m back by the railing,” or “I’m standing up waving my hand.”

It helps to focus on these details to narrow down the search. We can’t process everything we see all at once, and it would take forever to scan a large crowd one face at a time. Neuroscientists call this attention—attending to specific features or positions in your field of vision primes the brain to notice them more quickly.

In the brain, some areas of the visual cortex specialize in detecting non-spatial features, like color, shape, or orientation; others specialize in extracting spatial cues, such as the direction of movement or the position of an item. The brain has to assemble these various inputs to make decisions about objects in our field of vision.

According to new research, both space-based and feature-based attention influence each other to help us find what we’re looking for, whether it’s a stoplight turning red or a friend waving a hand in the crowd.

Color And Motion

In 2014, David Freedman, professor of neurobiology at the University of Chicago, and Guilhem Ibos, a postdoctoral scholar in Freedman’s lab, identified the lateral intraparietal area (LIP) as a brain region that assembles and processes visual information based on feature-based attention to things like color and motion. In that study, they shed light on a unique characteristic of neurons in LIP and how they respond to visual stimuli.

Individual neurons shift their selectivity to color and direction depending on the task at hand.

In experiments with monkeys, if the subject was looking for red dots moving upward, for example, a neuron would respond strongly to directions of motion close to upward and to colors close to red. If the task switched to another color and direction seconds later, that same neuron would become more responsive to the new combination.
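To make that flexibility concrete, here is a minimal sketch of the idea in Python. It is an illustrative toy, not the authors' actual model: it assumes bell-shaped (Gaussian) tuning and simply moves a neuron's preferred feature to whatever target the task currently demands. The function names and the width parameter are invented for the example.

```python
import numpy as np

def gaussian_tuning(stimulus, preferred, width=30.0):
    """Bell-shaped tuning: response falls off with the distance
    between the stimulus feature and the neuron's preferred value."""
    return np.exp(-0.5 * ((stimulus - preferred) / width) ** 2)

def lip_response(stimulus, current_target, width=30.0):
    """Toy LIP neuron whose preference tracks the current search
    target: when the task switches, so does its tuning."""
    return gaussian_tuning(stimulus, preferred=current_target, width=width)

# Task: find dots moving upward (90 degrees; directions are treated
# linearly here for simplicity, ignoring wrap-around).
print(lip_response(stimulus=85, current_target=90))    # near the target -> ~1.0
print(lip_response(stimulus=200, current_target=90))   # far from the target -> ~0.0

# Seconds later the task switches to downward motion (270 degrees);
# the same toy neuron now responds best near the new target.
print(lip_response(stimulus=265, current_target=270))  # strong again
```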

Ibos and Freedman added space-based attention to the mix in the latest study. Monkeys were again trained to look for dots of a certain color moving in a certain direction, but this time they had to be in a specific area of the display too.

So instead of just looking for red dots moving upward, they had to look for red dots moving upward in the top right corner of the screen, and ignore stimuli located elsewhere.

Flexible Response

Individual neurons are tuned to different areas of visual space (called receptive fields). If a given image were divided up into squares, each neuron would respond to a different square.

In the experiments, Ibos and Freedman recorded the activity of individual neurons as the monkeys performed the tasks, and saw that attention to the spatial position of a stimulus strongly influenced the feature selectivity of LIP neurons.

If the subject was looking for red dots moving upward inside the receptive field of the recorded neuron, the effects of feature-based attention on LIP neurons were larger than if the subject attended to the same stimuli located outside that receptive field.
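One simple way to picture that interaction is a toy multiplicative-gain model, sketched below. This is an illustrative assumption, not the circuit the paper describes: spatial attention is modeled as a gain factor that scales up whatever feature-based modulation the neuron already shows, and all parameter values are made up for the example.

```python
import numpy as np

def feature_match(stimulus_feature, target_feature, width=30.0):
    # 0..1 similarity between the stimulus feature (e.g. its color or
    # direction) and the feature currently being searched for.
    return np.exp(-0.5 * ((stimulus_feature - target_feature) / width) ** 2)

def lip_response(stimulus_feature, target_feature,
                 attending_inside_rf, baseline=1.0,
                 feature_mod=0.5, spatial_gain=3.0):
    """Toy neuron: the feature-attention modulation is scaled up when
    spatial attention is directed inside the receptive field (RF)."""
    gain = spatial_gain if attending_inside_rf else 1.0
    return baseline * (1.0 + feature_mod * gain *
                       feature_match(stimulus_feature, target_feature))

# Same red-ish stimulus (color value 5, target 0), with spatial
# attention directed inside versus outside the neuron's RF:
print(lip_response(5, 0, attending_inside_rf=True))   # larger modulation
print(lip_response(5, 0, attending_inside_rf=False))  # weaker modulation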

“It’s one of the first times that both kinds of attention have been looked at together,” says Freedman. “This particular part of the brain [LIP] seems to be very flexible, changing what it responds to depending on what it is you’re looking for at that moment.”

The results appear to show that the LIP integrates attention-based inputs from earlier in the brain’s visual processing system. For example, an area in the visual cortex called V4 responds strongly to colors and other features, and an area called MT responds to direction of movement.

The Lateral Intraparietal Area

The LIP puts these together to help the brain find what it’s looking for based on color, direction of movement, and position in space.
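As a rough illustration of that integration step, the sketch below multiplies together a V4-like color match, an MT-like direction match, and a spatial gate to produce a single "is this the target?" signal. The Gaussian similarity function, the gating rule, and the 0.5 threshold are all assumptions made up for the example, not the mechanism the study establishes.

```python
import numpy as np

def channel_match(value, target, width=30.0):
    # Similarity of one upstream channel (a V4-like color signal or an
    # MT-like direction signal) to the searched-for value.
    return np.exp(-0.5 * ((value - target) / width) ** 2)

def lip_target_evidence(color, direction, in_attended_location,
                        target_color, target_direction):
    """Combine color, motion direction, and spatial position into a
    single target signal, as a toy stand-in for LIP's readout."""
    if not in_attended_location:
        return 0.0  # stimuli outside the attended location are ignored
    return (channel_match(color, target_color) *
            channel_match(direction, target_direction))

# Looking for red (color 0 on an arbitrary axis) moving upward
# (90 degrees) in the attended corner of the display:
evidence = lip_target_evidence(color=5, direction=95,
                               in_attended_location=True,
                               target_color=0, target_direction=90)
print("target" if evidence > 0.5 else "not a target")
```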

“It’s more like the LIP reads out the activity of the visual cortex and uses that to make a decision,” Ibos says. “It’s not really the place where the attention is created, it’s more the place where the attention-based neuronal signal is processed.”

Ibos says the next step is understanding how the LIP assembles these pieces of information, and where it fits into the larger decision-making process.

“Why does the LIP integrate these signals? How does it cooperate with other cortical areas? We believe that it plays an important role in detecting whether the stimulus was a target or was not a target,” he says. “The next part of the study is understanding how the LIP integrates, combines, and computes this kind of information in order to make decisions.”

Guilhem Ibos and David J. Freedman. Interaction between Spatial and Feature Attention in Posterior Parietal Cortex. Neuron (2016). DOI: http://dx.doi.org/10.1016/j.neuron.2016.07.025
