Behavioral Signals Affect All Visual System Processing Stages

Functional tuning maps to individual object dimensions

The conventional view in research has been that the primary objective of human perception is to distinguish objects and assign them to distinct categories: for example, the object we see is a dog, and dogs are classified as animals. Researchers from the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig and Justus Liebig University Giessen, together with the National Institutes of Health in the United States, have recently demonstrated that this notion is incomplete.

A recent study demonstrated that brain activity when seeing objects is explained much better by a variety of behaviorally relevant dimensions than by category membership alone.

“Our results have shown that recognition and categorization are important goals of our vision, but by no means the only ones,”

said last author Martin Hebart, group leader at the Max Planck Institute and professor at Justus Liebig University.

Behaviorally Relevant Dimensions

Until now, it was thought that the brain’s visual system disassembles the objects we see into very basic features and then gradually reassembles them in order to recognize them.

“In fact, we find behaviorally relevant signals at all processing stages in the visual system. We were able to show this based on the behaviorally relevant dimensions we had previously discovered,”

Hebart said.

The researchers used a computer model to identify 66 object dimensions from behavioral data of more than 12,000 study participants. These dimensions not only explain categorization, i.e., whether a dog is an animal, but also cover other characteristics, such as colors and shapes, as well as graded values, for example, how typical a particular dog is of the category animal.

This allowed the team to explain much more clearly how our brain allows us to perceive and understand the objects in our environment.
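To make the idea concrete, the sketch below shows one way such object dimensions can be learned from behavioral similarity judgments: a sparse, non-negative embedding is fit to odd-one-out choices so that each dimension can later be inspected and labeled. The triplet task setup, the loss, and all names and hyperparameters here are illustrative assumptions, not the authors' code or data.

```python
import numpy as np

# Illustrative sketch: learn a sparse, non-negative object embedding from
# odd-one-out triplet judgments. All data are simulated and all names and
# hyperparameters are assumptions, not the study's code.

rng = np.random.default_rng(0)

n_objects = 720      # number of distinct objects (matching the image set size)
n_dims = 66          # target number of behavioral dimensions
n_triplets = 20_000  # simulated judgments; the real dataset is far larger

# Each row (i, j, k) encodes one choice: i and j were judged most similar,
# so k is the "odd one out".
triplets = rng.integers(0, n_objects, size=(n_triplets, 3))

# Non-negative embedding matrix: one row per object, one column per dimension.
X = np.abs(rng.normal(0.1, 0.01, size=(n_objects, n_dims)))

lr, l1 = 0.05, 1e-3  # learning rate and sparsity penalty (assumed values)

for epoch in range(5):
    for i, j, k in triplets:
        # Dot-product similarities between the three objects.
        s = np.array([X[i] @ X[j], X[i] @ X[k], X[j] @ X[k]])
        # Softmax probability that each pair is picked as the most similar one.
        p = np.exp(s - s.max())
        p /= p.sum()
        # Gradient of the negative log-likelihood of the observed choice (i, j).
        g = p.copy()
        g[0] -= 1.0
        grad_i = g[0] * X[j] + g[1] * X[k]
        grad_j = g[0] * X[i] + g[2] * X[k]
        grad_k = g[1] * X[i] + g[2] * X[j]
        # Gradient step with an L1 sparsity term, clipped to stay non-negative.
        X[i] = np.maximum(0.0, X[i] - lr * (grad_i + l1 * np.sign(X[i])))
        X[j] = np.maximum(0.0, X[j] - lr * (grad_j + l1 * np.sign(X[j])))
        X[k] = np.maximum(0.0, X[k] - lr * (grad_k + l1 * np.sign(X[k])))

# Each column of X is a candidate "object dimension" that can be inspected
# and labeled (e.g. animal-relatedness, color, elongation) after training.
```

With real judgment data, the sparsity term encourages dimensions that are easy to interpret, which is what allows graded properties such as typicality to emerge alongside category membership.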

Similarity Embedding

Figure: An fMRI encoding model of object dimensions underlying human similarity judgements. Credit: Nat Hum Behav (2024), doi: 10.1038/s41562-024-01980-y, CC-BY.

First author Oliver Contier analyzed the data of three study participants whose brain activity was measured in an MRI scanner over 15 sessions while they viewed more than 8,000 different images of 720 objects.

“When the participants saw a rocket, for example, we were able to measure from the brain activity that their visual system not only recognized that it was a rocket or that a rocket is a vehicle, but also that it is gray and elongated, has to do with fire, can fly, or sparkles. All processing stages of our perceptual system are therefore involved in capturing a broad spectrum of behaviorally relevant properties that together make up our perception,”

said Contier.
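The figure above refers to an fMRI encoding model of these object dimensions. Below is a minimal sketch of such a voxel-wise encoding model, assuming simulated data and a ridge regression fit with scikit-learn; the study's actual preprocessing and modeling pipeline may differ.

```python
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import train_test_split

# Illustrative voxel-wise encoding model: predict fMRI responses from the
# 66 object dimensions. Data are simulated; the regression choice (ridge)
# and the train/test split are assumptions made for this sketch.

rng = np.random.default_rng(1)
n_images, n_dims, n_voxels = 8_000, 66, 500

# Design matrix: dimension values of the object shown on each trial.
dims = rng.random((n_images, n_dims))

# Simulated responses: each voxel is tuned to a few dimensions, plus noise.
tuning = rng.normal(0, 1, (n_dims, n_voxels)) * (rng.random((n_dims, n_voxels)) < 0.1)
bold = dims @ tuning + rng.normal(0, 0.5, (n_images, n_voxels))

X_train, X_test, y_train, y_test = train_test_split(
    dims, bold, test_size=0.2, random_state=0
)

# RidgeCV handles multi-output targets, so all voxels are fit in one call.
model = RidgeCV(alphas=np.logspace(-2, 3, 10)).fit(X_train, y_train)
pred = model.predict(X_test)

# Per-voxel accuracy: correlation between predicted and held-out responses.
r = np.array([np.corrcoef(pred[:, v], y_test[:, v])[0, 1] for v in range(n_voxels)])
print(f"median voxel correlation: {np.median(r):.2f}")

# The fitted weights (model.coef_, shape n_voxels x n_dims) indicate which
# behaviorally relevant dimensions each voxel is tuned to.
```

In this kind of analysis, voxels throughout the visual system whose responses are well predicted by a dimension's weights can be described as tuned to that behaviorally relevant dimension, which is the sense in which behavioral signals appear at all processing stages.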

This work reveals a multidimensional framework that is consistent with objects’ diverse behavioral relevance.

“This ultimately explains our broad range of human behaviors better than the categorization-focused approach, and this in turn is crucial for understanding how we perceive and interact with our visual world in a meaningful way,”

added Hebart.

Open access funding for the project was provided by the Max Planck Society.

  1. Contier, O., Baker, C.I. & Hebart, M.N. Distributed representations of behaviour-derived object dimensions in the human visual system. Nat Hum Behav (2024). doi: 10.1038/s41562-024-01980-y