Listening in the Din: A Factor in Learning Disabilities?

Kraus, Nina PhD; White-Schwoch, Travis

doi: 10.1097/01.HJ.0000471628.60865.b9
Hearing Matters

Dr. Kraus, left, is a professor of auditory neuroscience at Northwestern University, investigating the neurobiology underlying speech and music perception and learning-associated brain plasticity. Mr. White-Schwoch, right, is a data analyst in Dr. Kraus's Auditory Neuroscience Laboratory (brainvolts.northwestern.edu), where he focuses on translational questions in speech, language, and hearing.

Early childhood is about listening. However, our acoustic environments often compromise sound-to-meaning mapping: children are bombarded by a relentless din. While noise presents a challenge for all of us, children face special difficulty because their language skills are under development, and their brains are not yet tuned to extract meaningful sounds from noise automatically.

Our view is that background noise disrupts the brain mechanisms important for language development. Several lines of evidence support this hypothesis:

* Children with auditory processing disorder—a hallmark of which is difficulty listening in noise—are often diagnosed with reading impairment. Children with dyslexia and specific language impairment also struggle to understand speech in noise.

* Noise disrupts the neural coding of consonants more than that of vowels, yet learning consonants is critical to building a phonemic inventory.

* Boosting classroom signal-to-noise ratios with assistive listening devices leads to stronger reading outcomes and to improvements in the underlying brain function.

PREDICTING LITERACY

We use electrophysiology to evaluate how the brain makes sense of speech in noise and home in on detailed aspects of how consonants are encoded. In particular, we ask how fast the response is, how well key harmonic frequencies are encoded, and how consistently the brain responds from trial to trial.
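
To make these metrics concrete, here is a minimal, hypothetical Python sketch of how response timing, harmonic encoding, and response consistency might be quantified from a set of single-trial recordings. The data, sampling rate, and frequencies of interest are invented placeholders, not the laboratory's actual pipeline or stimulus parameters.

```python
# A minimal, hypothetical sketch of the kinds of response metrics described
# above, computed from simulated single-trial recordings (trials x samples).
# All values below are placeholders, not the laboratory's analysis pipeline.
import numpy as np

fs = 16000                                   # assumed sampling rate (Hz)
rng = np.random.default_rng(1)
trials = rng.normal(size=(300, 1600))        # placeholder neural responses

avg = trials.mean(axis=0)

# 1. Response timing: a crude latency estimate from the largest peak (ms)
latency_ms = np.argmax(np.abs(avg)) / fs * 1000

# 2. Harmonic encoding: spectral amplitude at assumed harmonics of a 100-Hz
#    speech fundamental
spectrum = np.abs(np.fft.rfft(avg)) / len(avg)
freqs = np.fft.rfftfreq(len(avg), d=1 / fs)
harmonics = [200, 300, 400, 500]             # Hz, hypothetical
harmonic_amp = np.mean([spectrum[np.argmin(np.abs(freqs - h))] for h in harmonics])

# 3. Response consistency: correlation between averages of odd and even trials
consistency = np.corrcoef(trials[::2].mean(axis=0), trials[1::2].mean(axis=0))[0, 1]

print(f"latency = {latency_ms:.1f} ms, harmonic amplitude = {harmonic_amp:.4f}, "
      f"consistency r = {consistency:.2f}")
```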

We also administer standardized tests of pre-reading skills, such as phonological awareness (knowledge of what sound contrasts are meaningful in speech) and rapid naming (ability to fluently recite written symbols). We recently asked how these responses predict early literacy skills, using statistical techniques that allow each of the three metrics just mentioned to make a unique contribution.1
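
As a rough illustration of what a "unique contribution" means statistically, the sketch below simulates three neural metrics, fits a least-squares model of a literacy score, and measures how much R² drops when each metric is removed. The data and effect sizes are made up; this shows the general idea, not the analysis reported in the paper.

```python
# A rough sketch of 'unique contribution': how much each neural metric adds to
# a least-squares prediction of a literacy score. All data here are simulated
# placeholders; this is not the published analysis.
import numpy as np

rng = np.random.default_rng(0)
n = 60                                        # hypothetical number of children
timing = rng.normal(size=n)                   # simulated response timing
harmonics = rng.normal(size=n)                # simulated harmonic encoding
consistency = rng.normal(size=n)              # simulated response consistency
literacy = 0.5 * timing + 0.4 * harmonics + 0.3 * consistency + rng.normal(scale=0.5, size=n)

def r_squared(X, y):
    """Ordinary least-squares R^2, with an intercept column added to X."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

predictors = {"timing": timing, "harmonics": harmonics, "consistency": consistency}
r2_full = r_squared(np.column_stack(list(predictors.values())), literacy)

# Unique contribution of each metric: drop it and see how much R^2 falls
for name in predictors:
    reduced = np.column_stack([v for k, v in predictors.items() if k != name])
    print(f"{name}: unique R^2 = {r2_full - r_squared(reduced, literacy):.3f}")
```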

In 4-year-old children, the integrity with which consonants were processed in noise strongly predicted early literacy. The relationship was consistent and profound, and the accuracy was remarkable: in about 85 percent of children, we could predict performance within three points.

Next, we tested 3-year-old children who were too young to take the behavioral test but could undergo electrophysiological testing. The same measures of speech processing predicted performance on early literacy tests one year later. In fact, we could make accurate longitudinal forecasts in all children.

Finally, we could predict school-age children's performance on a constellation of literacy skills using the same neurophysiological measures. In addition, electrophysiology correctly identified 75 percent of children who had a learning disability and 90 percent of children who were typically developing.
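
To connect those percentages to standard screening terminology, here is a tiny sketch of the sensitivity/specificity arithmetic. The counts are hypothetical, chosen only so the output reproduces the 75 and 90 percent figures.

```python
# Sensitivity/specificity arithmetic behind figures like "75 percent of children
# with a learning disability and 90 percent of typically developing children
# correctly identified." The counts below are invented for illustration.
def sensitivity_specificity(true_pos, false_neg, true_neg, false_pos):
    sensitivity = true_pos / (true_pos + false_neg)  # LD children correctly flagged
    specificity = true_neg / (true_neg + false_pos)  # typical children correctly cleared
    return sensitivity, specificity

# Hypothetical sample: 20 children with a learning disability, 40 typically developing
sens, spec = sensitivity_specificity(true_pos=15, false_neg=5, true_neg=36, false_pos=4)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")   # -> 75%, 90%
```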

Therefore, this method offers a powerful approach to identifying which children are likely to succeed in developing literacy and to singling out a smaller group of candidates for in-depth evaluation and treatment.

WHAT ARE THE IMPLICATIONS?

One clear outcome of this work is the discovery of a biological marker for early literacy. Electrophysiology is objective and fast, and it requires minimal cooperation from the patient. In 20 minutes, we capture a biological looking glass into the hearing brain and its progress in reading development.

Early interventions are extremely effective in staving off a lifelong struggle to read, but these approaches are most efficacious before age 5½. If we can identify children early who are at risk for literacy difficulties, we can provide them with the necessary tools to jump-start reading development.

Audiologists have a clear role in this process. Few healthcare providers are such experts in evaluating young children, and our electrophysiological approach is similar to the auditory brainstem response (substituting consonants in noise for clicks or tones). Indeed, these responses are reliable in infants, and we would argue that consonants in noise should be used to screen the newborn hearing brain for risk of language impairment.

Regarding intervention, these results highlight the role that hearing in noise plays in language development. Our radical idea is to provide listening-in-noise training to very young children. If we can teach children to zero in on speech in noisy environments, we can transform everyday listening into everyday learning.

Reference

1. White-Schwoch T, Woodruff Carr K, Thompson EC, et al. Auditory processing in noise: a preschool biomarker for literacy. PLoS Biol. 2015;13(7):e1002196. http://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.1002196
Copyright © 2015 Wolters Kluwer Health, Inc. All rights reserved.