Imagine the common scenario known as the cocktail party effect. You're in a loud and crowded room, trying to concentrate on what the person next to you is saying.
As you strain to hear the spoken words, you're using your eyes as well as your ears. You watch the shape of your conversation partner's mouth for clues about what they'll say next, like pursed lips hinting at an upcoming "p" or "b" sound.
"The brain combines what you're hearing with what you're seeing to get the best sense of the world," said Ross Maddox, a postdoctoral fellow at the University of Washington's Institute for Learning & Brain Sciences.
At I-LABS, Maddox works in the Laboratory of Auditory Brain Sciences & Neuroengineering, which is directed by Adrian KC Lee, an assistant professor in the UW department of speech and hearing sciences. Their work is part of the Institute's innovative brain and behavior research, which aims to improve the quality of life for people everywhere.
Maddox and Lee's latest research paper, published February 5 in the open-access journal eLife, shows that the timing of visual and auditory information is an essential part of how vision improves people's ability to perceive sounds.
The Experiment
Participants in the experiment listened to two tones through headphones while concentrating on a changing visual stimulus displayed on a computer screen in front of them. They had to ignore one of the tones and listen for the other, pressing a button whenever they heard the pitch of the target tone change.
Meanwhile, a white circle appeared on a black computer screen. The circle's radius grew and shrank over the 14-second trials, making it appear to wobble, somewhat like a mouth speaking.
In some trials, the radius changes coincided with the changing loudness of the target tone, serving as a sort of visual cue for the participant. In other trials, the radius changed with the loudness of the tone to be ignored. And in a third type of trial, the changes in the circle's radius were unrelated to either tone.
A photo of the experimental set-up is below, and videos of the stimuli are available in the research paper.
Photo caption: Participants in the experiment watched a pulsing gray disc on a black computer screen while listening to sounds through earphones. They pressed a button when they heard a brief pitch change in the target sound or when they saw a brief color change. Credit: Institute for Learning & Brain Sciences.
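To make the design concrete, here is a minimal Python sketch of how a stimulus like this might be constructed: two amplitude-modulated tones plus a disc whose radius follows one of the loudness envelopes, depending on the trial type. The specific values (carrier frequencies, envelope smoothing, frame rate) and the function names are illustrative assumptions, not the parameters or code used in the actual study.

```python
import numpy as np

# Illustrative sketch only: all parameter values below are assumptions,
# not those used in the eLife study.
FS = 44100          # audio sample rate (Hz)
TRIAL_SEC = 14      # trials lasted about 14 seconds
FRAME_RATE = 60     # video frames per second (assumed)

def random_envelope(n, rng, smooth=2000):
    """Slowly varying loudness envelope scaled to [0, 1] (assumed shape)."""
    e = rng.standard_normal(n)
    kernel = np.hanning(smooth)
    e = np.convolve(e, kernel / kernel.sum(), mode="same")
    e -= e.min()
    return e / e.max()

def make_trial(condition="match", seed=0):
    """Build one trial: two amplitude-modulated tones and a disc-radius trace.

    condition: 'match'    -> disc radius follows the target tone's envelope
               'mismatch' -> disc radius follows the ignored tone's envelope
               'neither'  -> disc radius follows an independent envelope
    """
    rng = np.random.default_rng(seed)
    n = FS * TRIAL_SEC
    t = np.arange(n) / FS

    env_target = random_envelope(n, rng)
    env_masker = random_envelope(n, rng)

    target = env_target * np.sin(2 * np.pi * 440.0 * t)  # target tone (440 Hz assumed)
    masker = env_masker * np.sin(2 * np.pi * 660.0 * t)  # tone to be ignored (660 Hz assumed)
    audio = target + masker

    if condition == "match":
        radius_env = env_target
    elif condition == "mismatch":
        radius_env = env_masker
    else:
        radius_env = random_envelope(n, rng)

    # Downsample the chosen envelope to the video frame rate to drive the disc radius.
    frame_idx = np.linspace(0, n - 1, TRIAL_SEC * FRAME_RATE).astype(int)
    radius = 0.5 + 0.5 * radius_env[frame_idx]  # radius in arbitrary screen units

    return audio, radius

audio, radius = make_trial("match")
print(audio.shape, radius.shape)
```

In this sketch, the only thing that differs between trial types is which loudness envelope drives the disc; the sounds themselves are identical, so any change in detection accuracy can be attributed to the audiovisual timing relationship.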
The Results
Maddox estimated the participants got about a 5 percent boost in accuracy when the visual and auditory stimuli were presented in sync.
"This is enough of an improvement to show that matching the timing of the audio and video stimuli helps with listening," he said.
The project was funded by the National Institutes of Health, the Hearing Health Foundation, the Wellcome Trust, the Royal Society, and Action on Hearing Loss. Other co-authors are Huriye Atilgan and Jennifer Bizley at The Ear Institute, University College London.
What's Going On In the Brain?
Maddox's study provides evidence that the brain forms a unified multisensory impression of a sound stimulus. This helps explain how, at a noisy cocktail party, we're able to combine the sight of the person we're talking to with the sound of their voice and then tune out the background noise.
That hearing boost wouldn't be there if, say, we took a phone call in the middle of the party. Without seeing the caller, it's harder to pick their voice out of the din of the crowded room.
Hearing aids that combine auditory and visual information are still some years away, Maddox said. Still, his work adds to mounting evidence that hearing aids of the future can't just be about hearing; they'll have to integrate visual information too.
"The brain is creating objects – it sees a head with a moving mouth and hears a voice. It assigns the voices to faces and locks them together," Maddox said. "This is essentially how we think the brain listens using both sights and sounds. So why not make hearing aids do the same?"
Future Applications
The research also has implications for improving computers, including voice-command applications.
While the brain combines sights and sounds "almost effortlessly," Maddox said, "we haven't been able to do this well with computers."
Now, with a new Pathway to Independence Award from the National Institutes of Health, Maddox will use I-LABS' brain-imaging tools to see which brain areas are involved in the visual system's enhancement of hearing.
"The Pathway to Independence grant is one of the most prestigious awards for early-career scientists," Lee said. "It's a testament to the creativity and innovativeness of his research and the potential impact it has to benefit lives that Ross has received this honor.”
In the new brain-imaging studies, Maddox hypothesizes that brain regions involved in vision and attention, multisensory areas of the cortex, and possibly areas of the frontal cortex involved in executive control all play a role in the boost the visual system gives to hearing.
He'll do the research first with healthy listeners, but eventually he hopes to test people who have normal hearing yet struggle to listen in noisy situations.
It's "completely amazing" how the brain understands sounds, Maddox said, and how the "mushy computer in our heads is way better" than any computerized device.
Figuring out the basic science of how the brain listens will eventually help not just healthy listeners struggling to keep up with the conversation, but also people who have hearing disabilities.
"Most of us take our ability to tune into one sound in a crowded setting for granted," Maddox said. "Having the technology to spy on what the brain is doing while it solves this complicated problem could lead to exciting innovations in basic neuroscience as well as new technologies."
###