New ways to diagnose and treat individuals who cannot speak, hear, or process language might not just ensure the right care—early intervention could also help treat or prevent other related disorders, according to findings presented by researchers on 14 February at the AAAS Annual Meeting.
Hearing loss might not just be an inevitable and inconsequential part of aging, said Frank Lin, an associate professor of otolaryngology-head and neck surgery, geriatric medicine, mental health, and epidemiology at Johns Hopkins University.
The loss of hearing might be a direct risk factor for cognitive decline—and contribute to the onset of dementia, he said.
Lin said there are several ways that hearing loss may affect cognitive function and increase the risk of dementia. If the brain is constantly receiving garbled messages from the ear, coping with degraded auditory input comes at the expense of higher-level brain functions like memory, he said. He also cited neuroimaging studies showing that prolonged periods of reduced auditory stimulation can lead to faster rates of atrophy, changing the brain’s structure. Additionally, the social isolation caused by hearing loss may contribute to further cognitive decline, Lin said.
“If we address hearing loss and we treat it well with things like hearing aids and counseling strategies, can we actually reduce the risk of cognitive decline and dementia?” asked Lin.
The answer is still unknown, but Lin is currently working to develop a clinical trial of 800 patients to determine whether hearing loss treatments in older adults can reduce brain decline. Results from a 60-person pilot study will be available after the pilot concludes in the next three-and-a-half months, Lin said.
Robert Voogt, who has primary progressive aphasia, types out a response on a communication device to answer questions at the AAAS Annual Meeting on 14 February. | Andrea Korte/AAAS
Joseph Duffy, a professor of speech pathology at the Mayo Clinic, is studying links between a particular speech disorder and other neurodegenerative problems.
If we want to communicate a thought through speech, Duffy said, we first formulate our language in the brain and then activate 100 different muscles using about 140,000 neuromuscular events per second to produce speech. Between this language formulation and execution is a process called motor speech programming—what the brain does to select, organize, and package the instructions to the muscles of speech, he said.
“When the brain is injured or damaged in some way, we can damage that programming mechanism, and when we do that we have what we call apraxia of speech,” Duffy said.
Primary progressive apraxia of speech can be a precursor to other neurodegenerative problems, possibly the first or only indicator, Duffy said. About half of patients with primary progressive apraxia of speech will develop neurological difficulties within four or five years, usually motor control issues—for instance, difficulty controlling eye or limb movements, urinary incontinence, or a syndrome called progressive supranuclear palsy.
Duffy’s research has found that primary progressive apraxia of speech predicts the underlying pathology of these neurodegenerative problems: an abnormal protein called tau. The distribution of tau in the brain’s neurons is similar in primary progressive apraxia of speech and in progressive supranuclear palsy, he said. Duffy said he hopes this link will have implications for future treatment, allowing earlier intervention for people with these neurodegenerative difficulties.
A similar but distinct language disorder called primary progressive aphasia can rob an individual of his or her ability to speak clearly, said Argye Elizabeth Hillis, professor and director of the cerebrovascular division of neurology at the Johns Hopkins Hospital. Primary progressive aphasia is often mistaken for Alzheimer’s disease or age-related dementia, but the neurodegenerative disease leaves patients’ thoughts perfectly lucid—while their ability to clearly express them declines.
The use of a particular type of brain imaging, resting-state fMRI, can aid early diagnosis, Hillis said. The testing looks at how well different parts of the brain—the left and right prefrontal cortices—operate together at rest, which helps determine the rate of decline of a patient’s speech abilities, she said.
Testing can also help doctors identify the semantic variant of aphasia, which causes difficulties comprehending word meanings. Hillis noted that it can be difficult to determine early on which variant of aphasia a patient has, as most patients begin by struggling with naming and spelling of words, regardless of variant. Yet tracking patients’ eye movements while they match words with pictures reveals a loss of confidence in word meanings. Patients who will develop the semantic variant look back and forth between the correct pictures and other unrelated pictures, even when they select the right picture, Hillis said.
“Their eyes reveal their lack of confidence about the meanings of words,” she said.
To augment speech and language therapy, the traditional treatment for patients with primary progressive aphasia, Hillis is also studying the use of transcranial direct current stimulation of the brain’s inferior frontal gyrus.
The treatment may increase synaptic plasticity, meaning it could “help unaffected parts take over for the damaged parts of the brain,” Hillis said.
Hillis’ preliminary study of six patients showed language improvements that persisted two weeks and two months after treatment, and a clinical trial funded by the U.S. National Institutes of Health is now underway.
Such treatments could help patients with primary progressive aphasia like Robert Voogt. In 2006, Voogt was assistant professor of clinical internal medicine at Eastern Virginia Medical School in Norfolk, Virginia, when he began having difficulties with speech. Today, Voogt has been diagnosed with the non-fluent/agrammatic variant of primary progressive aphasia. Although his speech is unintelligible, Voogt’s understanding of language is clear.
Voogt responded to questions at the AAAS meeting using a MiniTalk communication device, typing out brief responses and selecting pre-programmed sentences that the device translated into speech.
“I have trouble speaking but I can understand you,” Voogt said through the device.