Our vision and hearing aren't as reliable as we might think, according to a study by life scientists at UCLA.
"Our basic sensory representation of the world -- how information from our eyes and ears is processed by neurons in the brain -- is inaccurate," said Ladan Shams, an associate professor of psychology in the UCLA College and senior author of the research, which was published today in the journal PLOS Computational Biology.
"We tend to view our senses as flawless and think that to see is to believe," she said. "So it's eye-opening to learn that our perceptions are flawed."
Shams and her colleagues conducted the research in part because there had never been a comprehensive study examining whether humans' spatial localization ability -- that is, the ability to immediately and accurately perceive where an object is located -- is as well honed as we believe it to be.
In the study, subjects were asked to sit facing a black screen, behind which were five loudspeakers. Mounted on the ceiling above was a projector capable of flashing bursts of light onto the screen, at the same spots where the speakers were located.
The scientists played brief bursts of sound and triggered flashes of light, in various combinations, and asked participants to identify where each originated. A total of 384 people, most between the ages of 18 and 22, participated; each was typically asked to localize about 525 stimuli during a 45-minute test.
In general, participants fared poorly when a flash or a sound was presented alone. They tended to perceive the light sources as closer to the center of the screen than they actually were, and the sounds as coming from closer to the periphery.
"The auditory task was especially difficult," said Brian Odegaard, a UCLA postdoctoral scholar who was the study's lead author.
The scientists were surprised by the results.
"We didn't expect these spatial errors; they're very counterintuitive," Shams said. "Spatial localization is one of the most basic tasks the brain performs, and the brain does it constantly."
What's more, she said, because the ability is shared with lower animals, logic would suggest that millions of years of evolution would have perfected spatial localization in humans. But that's not the case. Shams isn't sure why, but one hypothesis is that the brain makes constant tradeoffs to best use its finite capacity.
"Maybe evolution has favored high precision in the center of the visual field," she said. "We are really good at localizing and discriminating at high acuity in the center of our vision, and that comes with the cost of making more errors at outer areas."
The study participants did, however, answer much more accurately when a flash and a sound were presented simultaneously at the same location.
"The brain is wired to use information from multiple senses to correct other senses," Shams said. "The saying is true: 'If you want to hear better, put your glasses on.'"
Odegaard said the study was the largest to date on sensory biases. Its findings could have applications in a range of fields, from the military -- where minute errors in identifying enemy locations can be critical -- to automobile safety. The UCLA research suggests that drivers can see the cars directly in front of them very well, but would have difficulty estimating the distance to vehicles on their left and right. Shams said driverless cars could be engineered to eliminate that deficiency.
Shams, whose laboratory is funded by the National Science Foundation, is also studying whether research on multisensory perception can help people with autism, schizophrenia and other disorders.