Two forms of sensory information, the feel of an object against the skin and the position of the body in space, have long been thought to flow into the brain through separate channels and to be integrated only in more sophisticated processing regions.
Now, with help from a specially devised mechanical exoskeleton that positioned monkeys’ hands in different postures, Johns Hopkins University researchers have challenged that view.
In a recent paper, they present evidence that the two types of information are integrated as soon as they reach the brain, by first-line sense-processing cells once thought incapable of such higher-order processing.
Past studies had indicated that the feel of an object against the skin and the position of the hands and fingers are processed separately by the sensory system's first-line sense processors, then passed along to more sophisticated brain regions for integration.
But it has been a challenge to reliably distinguish brain activity driven by these two inputs from each other, and from the brain's own commands to the muscles, says Manuel Gomez-Ramirez, Ph.D., an assistant research scientist at Johns Hopkins.
Manipulating Monkey Hands
To solve that problem, Steven Hsiao, Gomez-Ramirez’s late mentor, and his colleagues developed a machine that positions a monkey’s hand and delivers stimuli to its fingers.
In the experiment, then-graduate student Sung Soo Kim, Ph.D., now a research specialist at the Howard Hughes Medical Institute, trained monkeys to perform an unrelated visual task while their hands were manipulated by the machine, which moved their fingers slightly from side to side and up and down at precise angles.
The machine also pressed a plastic bar to the monkeys’ fingertips in different orientations. By monitoring the monkeys’ brains in real time, the research group saw that the position and touch information were conveyed through the same cells in the somatosensory cortex.
“This study changes our understanding of how position and touch signals are combined in the brain,” says Gomez-Ramirez.
The findings could be used in efforts to better integrate prostheses with patients’ brains so that they behave more like natural limbs, he notes.
“Holding objects and exploring the world with our hands requires integrating many sensory signals in the brain and continuously supplying this information to motor areas so that it can issue the appropriate commands for holding objects,” Gomez-Ramirez says. “Our understanding of how these processes occur is very limited, and Steve Hsiao spent a lot of time thinking about the problem and figuring out how to test it.”
Sung Soo Kim et al. Multimodal Interactions between Proprioceptive and Cutaneous Signals in Primary Somatosensory Cortex. Neuron, April 2015. DOI: 10.1016/j.neuron.2015.03.020
Photo: Heidi Cartwright, Wellcome Images, Creative Commons by-nc-nd 4.0