Spatial Perception In Hearing Is Linked With Interaural Time Difference

When you hear a sound, how do your senses figure out which direction it is coming from? A sound coming from your right will reach your right ear a fraction of a millisecond before it reaches your left ear. The brain uses this difference, known as the interaural time difference (ITD), to locate the sound.
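To put rough numbers on this, here is a minimal Python sketch of ITD as a function of source direction using the classic Woodworth spherical-head approximation, ITD ≈ (r/c)(θ + sin θ). The head radius, speed of sound, and the formula itself are textbook assumptions chosen for illustration; they are not the measured head-filtering data analysed in the study discussed below.

```python
import numpy as np

# Illustrative sketch only: Woodworth spherical-head approximation of ITD,
# not the measured head-filtering analysis used in the study discussed below.
HEAD_RADIUS_M = 0.0875      # assumed average human head radius (metres)
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air (m/s)

def itd_seconds(azimuth_deg: float) -> float:
    """Approximate ITD for a far-field source at the given azimuth
    (0 deg = straight ahead, 90 deg = directly to one side)."""
    theta = np.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND_M_S) * (theta + np.sin(theta))

for az in (0, 15, 45, 90):
    print(f"{az:>2} deg: ITD ≈ {itd_seconds(az) * 1e6:6.0f} µs")
```

Under these assumptions the ITD grows from zero for a source straight ahead to roughly 650 µs for a source directly to one side, which is why the delay is best measured in fractions of a millisecond.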

But we are also significantly better at locating sounds that come from in front of us than from sources at our sides. This may be due in part to differences in the number of neurons available to detect sounds from these different locations.

It could also reflect differences in the rates at which those neurons fire in response to sounds. Theoretical biologists speculate that we may have evolved this way because it is efficient for hunting and foraging, just as has been proposed for frontal vision¹.

These factors by themselves, however, don’t fully explain why humans are so much better at localizing sounds in front of them.

Neural Network Properties

Now, Rodrigo Pavão of Albert Einstein College of Medicine and colleagues have shown² that the brain has evolved the ability to detect natural patterns that exist in sounds as a result of their location, and to use those patterns to optimize the spatial perception of sounds.

Pavão et al. showed that the way in which the head and inner ear filter incoming sounds has two consequences for how we perceive them. Firstly, the change in ITD for sounds coming from different sources in front of a person is greater than for sounds coming from their sides (see the rough sketch below).

And secondly, the ITD for sounds that originate in front of a person varies more over time than the ITD for sounds coming from the periphery. By playing sounds to healthy volunteers while removing these natural differences, the researchers found that ITD statistics were correlated with a person’s ability to tell where a sound was coming from.
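The first of these consequences already falls out of simple head geometry. Under the same Woodworth approximation sketched above, the rate of change of ITD with azimuth is (r/c)(1 + cos θ), which is about twice as large straight ahead (θ = 0°) as at the side (θ = 90°). The short check below is illustrative only; it stands in for, rather than reproduces, the measured head-filtering statistics computed in the paper.

```python
import numpy as np

# Rough check: how much does ITD change per degree of azimuth?
# Derivative of the Woodworth formula ITD = (r/c) * (theta + sin(theta)).
HEAD_RADIUS_M = 0.0875      # assumed head radius (metres)
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air (m/s)

def itd_rate_us_per_deg(azimuth_deg: float) -> float:
    """d(ITD)/d(azimuth) in microseconds per degree."""
    theta = np.radians(azimuth_deg)
    per_radian = (HEAD_RADIUS_M / SPEED_OF_SOUND_M_S) * (1.0 + np.cos(theta))
    return per_radian * (np.pi / 180.0) * 1e6

print(itd_rate_us_per_deg(0))    # ≈ 8.9 µs per degree straight ahead
print(itd_rate_us_per_deg(90))   # ≈ 4.5 µs per degree at the side
```

Equal shifts in source direction therefore produce larger ITD changes for frontal sources, which is one ingredient of the better spatial acuity in front.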

By uncovering how the brain determines the location of sounds, this work could eventually lead to the development of more effective hearing aids. The results also provide clues to how other senses, including vision, may have evolved to respond optimally to the environment.


  1. Mark A Changizi, Shinsuke Shimojo. “X-ray vision” and the evolution of forward-facing eyes. Journal of Theoretical Biology 2008;254:756–767. ↩︎

  2. Rodrigo Pavão, Elyse S Sussman, Brian J Fischer, José L Peña. Natural ITD statistics predict human auditory spatial perception. eLife 2020;9:e51927. DOI: 10.7554/eLife.51927 ↩︎


Last Updated on November 8, 2022