MELBOURNE, AUSTRALIA — Researchers at the BABILab, part of the Bionics Institute in Melbourne, are mapping the brains of babies to determine whether hearing aids or cochlear implants are helping to develop the language centers of the brain. Sensors placed in skullcaps worn by babies engaged in auditory and language tasks provide a unique way to evaluate brain activity in young children.
The scans obtained can then be compared with those from infants with normal hearing, and the data can help determine whether amplification needs adjustment.
Current hearing tests for children, such as otoacoustic emissions (OAE) and auditory brainstem response (ABR), are an “approximate best guess,” said Professor Colette McKay, head of translational hearing research at the institute.
“When a baby is born, their brain is ready to develop all the structures that support language development and speech perception and production. You need to have the auditory input in order to develop that structure and make it work. The earlier you intervene and give them sound, the better outcome for their whole quality of life.” –Prof. Colette McKay
Scanning the Brain for Hearing Loss
To measure the brain’s response to sound, researchers at BABILab are using functional near-infrared spectroscopy (fNIRS). This light-based technology was chosen over EEG because of concerns that EEG could pick up electrical interference from a cochlear implant system.
fNIRS measures changes in oxygenated blood, which correlate with activation of brain regions when certain sounds are heard or understood by the child.
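In broad terms, fNIRS works by shining near-infrared light at two or more wavelengths and converting the measured changes in optical density into concentration changes of oxy- and deoxy-hemoglobin via the modified Beer-Lambert law. The sketch below illustrates that conversion step only; the wavelengths, extinction coefficients, and path-length values are illustrative placeholders, not the values used by the BABILab system.

```python
import numpy as np

# Illustrative extinction coefficients (1/(mM*cm)) for oxy-hemoglobin (HbO)
# and deoxy-hemoglobin (HbR) at two common fNIRS wavelengths.
# These numbers are placeholders for the sketch, not published values.
EXT = np.array([
    [0.60, 1.55],   # ~760 nm: [eps_HbO, eps_HbR]
    [1.20, 0.80],   # ~850 nm: [eps_HbO, eps_HbR]
])

def mbll(delta_od, distance_cm=3.0, dpf=6.0):
    """Modified Beer-Lambert law (simplified sketch).

    delta_od    -- optical-density changes measured at the two wavelengths
    distance_cm -- source-detector separation on the scalp
    dpf         -- differential path-length factor (scattering correction)

    Returns (d_HbO, d_HbR): concentration changes in mM.
    """
    path = distance_cm * dpf                 # effective optical path length
    # Solve the 2x2 linear system: delta_od = (EXT * path) @ [d_HbO, d_HbR]
    return np.linalg.solve(EXT * path, np.asarray(delta_od))

d_hbo, d_hbr = mbll([0.01, 0.02])
```

A rise in oxygenated hemoglobin (positive `d_hbo`) over an auditory region is the kind of signal the researchers would read as the brain responding to sound; real pipelines add filtering, motion correction, and statistical modeling on top of this conversion.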
The tests will allow researchers to look across various networks of the brain that are contributing to language development.
“These language areas in the brain should be highly connected. If they’re not, we may be able to target them with some sort of speech and language therapy. Or if a child was wearing a hearing aid and their language wasn’t developing, it could be a sign they should try a cochlear implant instead.”
Professor McKay and her team are developing a purpose-built fNIRS machine which, if their proof-of-concept trial is successful, could one day be used at other hearing centers.
Source: Bionics Institute, Herald Sun; Featured image: Herald Sun, via David Caird
This looks like a useful investigative tool that may help in selecting an implanted device. What remains to be determined is whether, even if speech coding is detected at the hippocampus, language interpretation will actually follow. Imaging does help identify responsive regions, but short-term memory function and long-term potentiation (LTP) may not be evaluated by this process.