The brain’s mechanisms for vision depend on other senses
25 May 2022
Every moment of our waking lives, images fall onto our eyes and pass through a series of processing steps in the brain that inform us about what is going on in the world. A team of researchers led by HBP partner University of Amsterdam has found that the time the brain needs to make a visual decision depends not only on the visual properties of the stimuli but also on auditory and tactile inputs. The work is now published in Nature Communications (rdcu.be/cN9bb).
When images from the outside world reach our eyes, signals cascade through the visual system. Neurons across different brain areas recombine these signals to form a representation that is then used to understand the world and react to it. The primary visual cortex is the first stage in the cerebral cortex where visual information is processed. Until now, our understanding has been that this process is determined mainly by the complexity of the visual scene to be processed. However, vision is just one of the senses and does not operate in isolation: there are also sound, touch, smell and other senses. The researchers therefore asked whether the processing time required to make a perceptual decision is affected when other senses also need to be monitored.
What we see is also what we hear and touch
“We compared two sets of subjects,” says Matthijs oude Lohuis, first author of the paper together with Jean Pie. “One cohort of mice was trained to report what they saw and ignore what they heard or felt on their whiskers. The other cohort was trained to report what they saw but also what they heard or felt.” The researchers found that the activity of neurons in the visual cortex differed depending on whether the mice had been trained to report only visual stimuli or other modalities as well. Brain signals indicating that a visual stimulus had been detected took longer to appear when the mice were also paying attention to sounds or touch. “We hypothesized that a longer duration of processing in the primary visual cortex would also indicate a longer involvement of this region in the transformation of sensory stimuli into appropriate motor responses,” says Umberto Olcese, a senior researcher in the team. To test this hypothesis, the researchers used optogenetics, a technique that enabled them to switch the visual cortex on or off by illuminating it with precisely targeted laser light.
As expected, activity in the visual cortex was necessary for detecting visual stimuli. Surprisingly, however, the time period during which the visual cortex was necessary for detection depended on whether the mice were also paying attention to other sensory modalities: a task that required monitoring another sensory modality prolonged the time during which activity in the visual cortex was needed to detect a visual stimulus, independently of the features of the images being processed.
Towards an integrated view of processing in the brain
The study challenges our current understanding of how vision works. While we are used to thinking of visual processing as occurring in isolation, the way the brain analyzes the images hitting our eyes is influenced, even at early stages, by other factors, including other sensory modalities and how we use sensory information.
This multimodal view of sensory processing underlies a theory of the brain mechanisms of perception and consciousness previously proposed by one of the team members, Prof. Cyriel Pennartz. The theory states that conscious processing is jointly shaped by multiple senses, within an overarching framework that characterizes perception as the construction of best-guess representations of our surroundings. It has also given rise to computer models built in the HBP (see: A robot on EBRAINS has learned to combine vision and touch).
“The results contribute to testing the theory and they support our main goal in the HBP, to develop computational models of object recognition in which different sensory modalities – such as vision and touch – converge into a unified representation generated by our brain,” says Pennartz.
“It’s a major step towards developing an integrated view of how different cortical regions jointly process the diverse, multimodal information that constantly reaches our senses.”
The modelling experiments can be accessed and run on the Neurorobotics Platform of the HBP’s EBRAINS research infrastructure.
Read the full paper in Nature Communications:
Multisensory task demands temporally extend the causal requirement for visual cortex in perception