New research sheds light on how the brain combines external information and internal memory to build a sense of touch.
The findings could help scientists develop better treatments for conditions such as stroke and autism spectrum disorder.
When you touch something, whether stepping onto a sandy beach or stroking the back of a dog, sensations fly into your brain. You feel the coarse grain of the sand under your feet, the fluffiness of the fur on your hand.
But you also bring a bit of yourself into the feeling: Along with the external stimulation from the beach or pup, there’s the memory of past moments—toweling sand from your toes during a summer vacation, snuggling with a much-missed family pet. We may all agree that something feels abrasive or soft, but each of us interprets that sensation slightly differently.
“When we perceive our environment, we’re actually doing two things,” says Boston University neurobiologist Jerry Chen, an expert on cognitive function. “We’re taking in all the senses, all the physical elements of the world; at the same time, we are applying our own types of inference, subjective interpretation of what we think we’re perceiving.”
In a new study in Science, Chen illuminates that process. Looking at mouse brains, Chen and a team of researchers discovered a circuit in the primary somatosensory cortex—the part of the brain that receives signals related to touch, temperature, and pain—that’s dedicated to computing tactile information. He says the circuit helps the brain figure out how to balance the stimulation coming from outside the body with existing knowledge.
The study may be significant for our understanding of a range of neurological disorders and neuropsychiatric diseases that can alter sensory perception, from strokes to autism spectrum disorder. Improved knowledge of the brain’s circuits, says Chen, an assistant professor of biology, may pave the way for more targeted treatments and interventions.
As part of their dive into the brain’s workings, the researchers developed a new method for surveying and watching cells: a platform that generates activity in the brain, shows the molecular composition of the firing cells, and helps compute all of the data. It allowed Chen to look at how different neurons in the cortex reacted and communicated when an animal touched an object—and how those neurons adapted when something in the environment shifted.
Chen and his team used the Allen Institute’s atlas of the mouse brain—a catalog of the different types of brain cells—as a starting point for the project. Chen says the atlas is great for pinpointing the location and category of a neuron, but it doesn’t really tell researchers much about the neuron’s functions. His findings bring that detail and color.
“It’s another level of understanding for how everything fits together,” says Chen. “The biggest thing is that we’ve married the catalog with the functional definition—that’s really going to open up a lot of ways for us to understand the brain.”
Support for this research came from a National Institutes of Health New Innovator Award, the Brain Research Through Advancing Innovative Neurotechnologies (BRAIN) Initiative, and the National Institute of Mental Health.
Source: Boston University