A team of scientists led by Karl Farrow at NeuroElectronics Research Flanders (NERF, empowered by imec, KU Leuven and VIB) is unraveling how our brain processes visual information. They identified specific roles for distinct neuronal cell types in passing on information from the eye to downstream brain regions that guide behavior. Such knowledge is essential to understand how sensory information guides our actions and decisions.
We use information about the world around us to guide our behavior. While the fluttering wings of a butterfly or a quickly approaching predator can both catch our attention, they trigger very different behavioral responses.
To get from detection to action, visual information is passed from the retina in our eye to different downstream brain regions. The nervous system consists of many different cells that work together in circuits, and understanding how these circuits relay information has puzzled researchers for decades.
"The first stage of visual processing, the transfer of information from the retina, happens through a wide range of retinal cell types, each with their own typical shapes and responses. One major target of these cells is the superior colliculus, a brain area that receives approximately 85% of the retinal outputs in rodents."
Chen Li, PhD student in Karl Farrow's lab at NERF
Mapping networks
"The aim was to decipher the wiring rules that enable the brain to integrate the behaviorally relevant visual information from the retina," says Katja Reinhard, a postdoctoral researcher in Farrow's lab. To this end, the team traced the connections of >30 retinal cell types in mice, each informing one or several brain areas about a certain feature of the visual world.
"We compared the shape, molecular properties and visual response of different retinal cells innervating two pathways that pass through the superior colliculus. By tracing circuits and examining neuronal activity, we found that there is a clear preference in which retinal cell types provide input to a given circuit."
In this way, the researchers deciphered a projection-specific logic in which each output pathway from the superior colliculus samples a distinct and limited set of retinal inputs. These findings suggest a mechanistic basis for how the superior colliculus selectively triggers behaviors.
Highly specific inputs and outputs
"Earlier studies suggested that there is a large degree of fuzziness in the information each neuron receives from the retina," says Farrow. "Our data suggests there are strict limits on the degree of mixing of retinal inputs that occurs in the superior colliculus, where each output pathway has access to a distinct, only partially overlapping, set of visual information encoded by the retina."
Understanding the specific network structure in this context will greatly enhance our ability to create mechanistic models of how sensory information triggers behaviors and informs decision-making.
Journal reference:
Reinhard, K., et al. (2019) A projection specific logic to sampling visual inputs in mouse superior colliculus. eLife. https://doi.org/10.7554/eLife.50697.