Researchers have identified two biomarkers that measure the ability to follow conversations in noisy environments; difficulty with this task is an early sign of hearing loss
Hidden hearing loss is one of the most common hearing disorders, limiting people's ability to follow conversations in noisy environments like a restaurant. The condition is also the hardest to diagnose in the clinic, because conventional hearing tests, called audiograms, don't measure this type of hearing damage. A study published earlier this year could pave the way for a new diagnostic test for hidden hearing loss.
Researchers at the Eaton-Peabody Laboratories at Mass. Eye and Ear have identified two biomarkers of brain function — one that represents listening effort, and another that measures the ability to process rapid changes in sound frequencies — that may provide new ways to objectively measure hidden hearing loss. Their study, published in January in the journal eLife, could inform the design of future clinical tests for hidden hearing loss.
Hidden hearing loss: a major issue likely on the rise
The World Health Organization estimates that a billion young adults are at risk for hearing problems due to prolonged exposure to high levels of noise. Currently, 48 million Americans are affected by hearing loss, and these numbers are expected to rise with the aging population. The first symptoms of hearing loss may present as an inability to follow a single speaker in crowded places such as restaurants. According to the study's senior author, such damage occurs with normal aging but can be accelerated by exposure to high-volume sounds from personal devices or other loud environmental noises.
“There are more people; there are more mechanized devices; there’s more background noise,” senior study author Daniel B. Polley, PhD, Director of the Lauer Tinnitus Research Center at Mass. Eye and Ear, told AARP. “Our longevity and the amount of noise in the environment have combined to create this perfect storm of difficulty.”
In 2009, a team of Mass. Eye and Ear investigators uncovered a new type of inner ear damage that could explain these poorly understood, but common, hearing complaints. They showed that noise and aging first damage the synapses that connect the hair cells to the nerve fibers, which ultimately carry neural signals to the brain. This damage was termed cochlear synaptopathy, and because audiograms measure hair cell function, the synaptic loss goes undetected, inspiring the popular term "hidden hearing loss."
In the new study, the researchers combed through over 100,000 records from visitors to the Mass. Eye and Ear hearing clinics and concluded that roughly 10 percent of visitors seeking care may have this form of hidden hearing loss: they arrived with a primary complaint of poor hearing, particularly in noisy environments, but when they received an audiogram, the results appeared normal.
“Our study was driven by a desire to develop new types of tests that reveal the problems that are missed by audiograms,” said lead study author Aravindakshan Parthasarathy, PhD, a postdoctoral fellow in the Eaton-Peabody Laboratories at Mass. Eye and Ear.
Measures of Brain Processing May Reveal Signs of Hidden Hearing Loss
The researchers designed three sets of tests and enrolled 23 young and middle-aged volunteers in the study.
The first test looked at how sensitive people were to the rise and fall of sound wave frequencies — the earliest stages of sound processing — by measuring electroencephalogram (EEG) signals from the surface of the ear canal. How accurately participants detected this stimulus predicted their performance on a test of understanding speech with multiple talkers. A second test asked subjects to listen to the same sounds and report the smallest change in sound frequency that they could perceive.
A third test measured listening effort, or the amount of cognitive resources needed to perceive and understand speech when multiple people are talking. Participants wore specialized glasses that measured changes in pupil diameter as they focused their attention on one speaker while others talked in the background. That test was based on previous research showing that changes in pupil size can reflect the amount of cognitive effort expended on a task.
The subjects all had normal hearing according to their clinical tests but, as expected, varied widely in their ability to understand sentences spoken with other talkers in the background. By combining these three tests, the researchers were able to account for nearly 80 percent of the variability in how well subjects could understand speech with multiple talkers in the background.
“This is a major improvement over audiograms and other common clinical measures of speech in silence, which account for roughly 0 percent of this variability,” said Dr. Polley.
The researchers said they were encouraged by these results, which they believe indicate a need to expand clinical testing in hearing clinics to include tests of how the brain processes sound, in addition to standard tests of how the ear is functioning. More research is needed before these tests can be used as a diagnostic tool, and the team is pursuing further studies focused on implementing these tests in the clinic.
“The biomarkers identified here do not require specialized equipment or marathon measurement sessions. In theory, they could be implemented into most hospital hearing clinics,” said Dr. Parthasarathy.