March 7, 2008
Science of hearing loss moving near speed of sound
Sure, Dylan's voice is throaty and growly, but he can articulate his songs as well as anyone. I know it from his CDs. At a distance of more than 50 yards from the stage, however, the form of his words, especially the higher frequencies of his consonants, is lost in the hurricane of sound from the mighty speakers and the reverberations of the theater's vast interior.
If I were a hearing-impaired person, I might switch my hearing aid to the "T" setting and get the best of Bob with sound of near-studio CD quality, while the hearing-normal folks around me were still worrying about those frogs and ducks.
At least I could do that if the concert were taking place at the DeVos Performance Hall in Grand Rapids, Mich., or the Rialto Cinemas Lakeside in Santa Rosa. These are among the growing number of facilities equipped with induction loop systems that broadcast directly from the output of instruments and microphones to hearing aids equipped to receive them.
Science is ringing in a new era in the world of the hearing-impaired, and the technologies to accommodate, treat and prevent hearing loss -- and even cure it -- are advancing at almost sonic speed. And that's welcome news, considering how doctors are wringing their hands over study after study predicting hearing loss for a generation that seems constantly connected, almost from birth, to MP3 players.
Age is the major cause of hearing loss, and predisposition to it is genetic: if your parents lost hearing with age, it's likely you will, too. Age-related hearing loss is found in about one-third of people older than 60 and half of people older than 80.
The hearing loss comes from breakdown of the "hair cells" in the cochlea (the spiral-shaped part of the inner ear containing the auditory nerve endings). There are some 16,000 to 20,000 of them in each ear, and each hair cell is believed to have about 100 tiny hairs known as stereocilia. Sound causes pressure variations in the cochlea, which makes the stereocilia vibrate and send impulses to the brain.
First lost with a combination of age and excessive sound -- like loud music from your iPod -- are the hair cells that resonate to higher frequencies, which is why speech becomes harder to understand as the ability to hear the higher frequencies of consonants disintegrates. But that could change, thanks to research at the House Ear Institute in Los Angeles.
"We're developing totally implantable hearing aids, so we may some day implant a device allowing a person to hear without having an [external] aid," said John House, the institute's president.
Within five years, such hearing aids should be sufficiently developed that the institute will be implanting them in children on a regular basis, House said.
Among other projects at the House Ear Institute are hearing aids that completely bypass the ear and wire sound directly into the brain.
"Many years ago, we developed a process of implanting electrodes directly applied to the surface of the brain in the area of what's called the cochlear nucleus, where the nerve attaches to the brain," House said. "Now we're working on penetrating electrodes, not only to go on the surface but to penetrate the surface of the brain. That's called a penetrating auditory brainstem implant. We've implanted eight or 10 now."
Founded in 1946 by House's father, Dr. Howard P. House, to advance hearing science and improve the quality of life, the institute is among the world's leading centers of hearing-related research and education. The connection of hearing to cognitive function is taken seriously.
"We're now screening newborns for hearing loss, something that was developed at the House Ear Institute," House said, "identifying them at birth, rather than waiting until parents notice they're not responding at the age of 1, or 2 or 3 years old."
Besides surgical and other medical interventions, the institute also does research and development on hearing technologies, such as signal processing algorithms applicable in hearing aids. Hybrid electrical-mechanical hearing aids are another example: The cochlea is implanted with an electrode for high frequencies associated with subtle voice recognition, while standard hearing aids provide the lower frequencies.
"We've been working on that for the past five years, and it's still investigational," House said. "Maybe within five years, we'll be able to implant hybrid [hearing aids] in people who have residual hearing but not enough to understand speech."
The hearing aids of today are digital, with increasingly powerful microchips and software that can be customized to an individual's frequency-specific pattern of hearing loss and can adjust to the conditions of a noisy gymnasium or a quiet garden. Hearing aids are moving toward unprecedented abilities to convey subtleties like footsteps in another room, the sound of the wind, the nuances in music or the voices of loved ones.
"Analog tends to amplify everything, whereas with digital we can actually tune it to respond only to whatever frequencies are needed, and we can insert different programs so the hearing aid can respond to different sounds," House said. "There have been tremendous advances in hearing aids in the last five years."
One example, House said, is the multiple-microphone hearing aid, available just in the last two or three years, which helps filter out background noise and tune in on close-up speech. The newest, he said, is the "open-fit" hearing aid. It uses a narrow and inconspicuous tube to carry amplified higher frequencies into the ear, leaving the ear largely open to a natural inflow of the lower frequencies rather than plugging the ear completely like a traditional ear mold.
"In the past, hearing aids couldn't do that because they couldn't eliminate the squeal, the feedback problem," House said.