Auditory Brain Development in Children With Hearing Loss – Part One

Wolfe, Jace PhD; Smith, Joanna MS

doi: 10.1097/01.HJ.0000503459.97846.5d
Top 10

Dr. Wolfe, left, is the director of audiology at Hearts for Hearing and an adjunct assistant professor at the University of Oklahoma Health Sciences Center and Salus University. Ms. Smith, right, is a founder and the executive director of Hearts for Hearing in Oklahoma City.

Editor's Note: This is the first installment of a two-part article. The conclusion will be published in the November issue.

As Dr. Carol Flexer aptly puts it: It's all about the brain. We hear with our brain; the ears are just the way in. Early identification of hearing loss and intervention must occur during the critical period of language development in the brain. Listening happens in the brain, not in the ears.

As pediatric hearing healthcare professionals, we are familiar with these mantras and catchphrases. In fact, we have heard and said these slogans so often that they almost seem like clichés. But there are powerful truths behind clichés that make them stand the test of time. Too often, we lose sight of the exact origins of the clichés and buzz-phrases that pervade our professional lives.

In this two-part article, we provide a brief overview of several relevant research studies examining the effects of hearing loss and audiologic intervention on auditory brain development. This summary includes a survey of research investigating auditory brain development in humans and animals, with a focus on the work of Andrej Kral, MD, PhD, one of the most prolific scholars in the area of early auditory brain development. His research on deaf white cats has substantially advanced our understanding of the influence of deafness and cochlear implantation on auditory brain development. We would be remiss if we failed to acknowledge a number of other brilliant researchers from around the world who have contributed to this line of study (including but not limited to Chris Ponton, Jos Eggermont, Anu Sharma, David Pisoni, David Ryugo, Bob Harrison, Karen Gordon, Lynne Werner, Nina Kraus, and Patricia Kuhl).

10. The Auditory Brain

Figure 1a.

The auditory brain extends far beyond Heschl's gyrus and is actually quite complex. In our auditory anatomy courses, we likely learned that auditory signals travel up the brainstem to the thalamus and on to the primary auditory cortex (otherwise known as Heschl's gyrus; Figs. 1a and 1b). Heschl's gyrus resides within the Sylvian fissure and courses medially from the superior temporal gyrus. Tonotopic organization is preserved throughout this pathway, including within the primary auditory cortex. Additionally, complex processing, which mediates functions ranging from simple detection to localization and extraction of a signal of interest from competing noise, occurs in groups of neurons at all levels of the auditory nervous system.

Figure 2a.

From the primary auditory cortex, auditory signals travel to the secondary auditory cortex, which has less clearly defined boundaries and components than the primary auditory cortex. Figure 2a shows an elementary example of the many areas in the temporal lobe and beyond that are typically thought to comprise the secondary auditory cortex. It is well known that the secondary auditory cortex plays a prominent role in our ability to understand speech. For instance, in 1874, Wernicke noted that an insult to Brodmann area 22, located in the secondary auditory cortex (Fig. 2b), results in an inability to understand speech.

9. When all goes right, the auditory brain's areas glow bright!

Figure 3.

When a person has sufficient access to intelligible speech throughout the first few years of life, the auditory areas of the brain light up like Times Square in response to auditory stimulation (Fig. 3). Green et al. used positron emission tomography (PET) to image the areas of the brain that were responsive when post-lingually deafened adults listened to speech through a cochlear implant (CI; Hear Res 2005;205[1-2]:184 http://bit.ly/2czClGh). To clarify, the participants had normal hearing during childhood, lost their hearing as adults, and received a CI after a variable duration of deafness (1 to 48 years). As shown in Figure 3, a broad area of activation was seen in the auditory areas of the brain. Specifically, activity in response to auditory stimulation was observed in both the primary and secondary auditory cortices. Also of note, this broad auditory activation occurred bilaterally, even though the participants were listening with a CI in the left ear only.

8. Secondary auditory cortex is the launching pad.

Figure 4.

The secondary auditory cortex is like the launching pad of the auditory area of the brain. Its complex connections to the rest of the brain are not entirely elucidated, but research shows that the secondary auditory cortex has multiple connections to other areas of the brain as well as feedback connections to the primary auditory cortex in the form of efferent tracts. The connections between the secondary auditory cortex and other areas within the same hemisphere are often referred to as intra-hemispheric connections. An example is the arcuate fasciculus, which connects the temporal and frontal lobes (Fig. 4). Numerous others exist, as do connections to areas deep to the cerebral cortex, such as the hippocampus.

The secondary auditory cortex also sends a robust number of efferent fibers back to the primary auditory cortex (i.e., feedback projections). It has been proposed that these efferent fibers likely play a role in tuning the primary auditory cortex to focus on signals of interest (David. Proc Natl Acad Sci USA 2012;109:2144 http://bit.ly/2clZyZc). Typical real-world environments are fraught with a cacophony of speech and environmental noises. For successful communication to occur, the auditory system must be able to focus on the acoustic elements of the listener's spoken language, an ability called feature representation or feature extraction (Allen. IEEE Trans Speech Audio Process 1994;2[4]:567; Kral. Brain Res Rev 2007;56[1]:259 http://bit.ly/2cm0psP; Kral. e-Neuroforum 2015;6:21 http://bit.ly/2cm0gFU). Research has shown that infants as young as 4 months begin to attend to the phonemes of their primary language while showing weaker responses to phonemes of foreign languages (Dehaene-Lambertz. Trends Neurosci 2006;29[7]:367 http://bit.ly/2cm1gJX).

Researchers have suggested that the primary auditory cortex detects acoustic features that an individual deems important, and the secondary and higher-order areas combine these features into meaningful representations (i.e., auditory objects; Kral. Neuroscience 2013;247:117 http://bit.ly/2cm3sBq; Kral, 2007 http://bit.ly/2cm0psP). Higher-order auditory areas contain pluripotent neurons that respond to multiple modes of stimulation (e.g., a neuron that responds to auditory, visual, and tactile stimulation), possibly enabling multi-modal integration. Additional research is needed to fully understand the roles of pluripotent neurons in the secondary auditory cortex.

7. I like bacon!

Figure 5.

We have yet to develop a full understanding of exactly how and where auditory objects are represented in the brain. Deriving higher-order meaning from the sounds we hear is certainly a complex process. "Fundamentally, everything that comes into our minds reduces to patterns of neural activity," according to Kai-How Farh, MD, a clinical geneticist at Boston Children's Hospital. In other words, each cognitive experience is represented by a unique network of neurons that produces the reality we perceive. For instance, when we hear the word "yellow," a certain set of neurons responds across the brain to produce the experience we associate with the word. Figure 5a illustrates a network of neurons that may be activated across the brain to represent the word "yellow." Most of the responsive neurons reside in the primary and secondary auditory cortices, but neurons in the frontal and parietal lobes (and possibly even in the occipital lobe or multimodal areas of the secondary auditory cortex) also respond.

Engagement of the frontal lobe allows us to extract higher-order meaning from the word “yellow.” For instance, we may conclude that we dislike the color yellow or that yellow is our favorite color. Furthermore, we may associate the color yellow with a traffic light, a canary, a favorite shirt, or a banana. Neurons responding in the frontal and parietal lobes also likely contribute to our ability to produce or speak the word “yellow.” Finally, pluripotent neurons in the secondary auditory cortex or neurons within the occipital lobe interconnected with the secondary auditory cortex also respond to allow us to form an image of yellow in our mind's eye.

Similarly, a unique network or pattern of neurons responds when we hear bacon frying in a pan (Fig. 5b). That distinct sizzle elicits responses from neurons throughout the primary and secondary auditory cortices. Even without seeing the bacon frying in the pan, we can form an image of it in our mind's eye because of integration between auditory-responsive neurons in the secondary auditory cortex and visually responsive neurons. We remember how bacon tastes and feels, and we may even begin to salivate as we hear the frying sound, all because of this integration between neurons.

In short, each sound that comes into our minds from our ears is reduced to a unique pattern of neural activity. For that sound to possess higher-order meaning and come to life, it has to travel from the primary to the secondary auditory cortex and form a neural network or connectome with multi-modal areas throughout the brain. Once again, the secondary auditory cortex serves as the launching pad for this interaction and integration.

6. Primary auditory cortex was born to hear.

Figure 6.

In 1999, Nishimura and colleagues published ground-breaking research that used PET imaging of brain responses to a variety of stimuli (Nature 1999;397[6715]:116 http://bit.ly/2cm48GV). The participants were pre-lingually deafened adults who had no auditory experience and used sign language prior to receiving a CI in adulthood. Nishimura et al. observed neural activity in the brain in response to three different stimuli: running speech, sign language, and meaningless hand movements. As shown in Figure 6, when the subjects listened to running speech, neural activity was observed in the primary auditory cortex contralateral to the implanted ear (though not in the primary auditory cortices of both hemispheres). This finding indicates that the primary auditory cortex is hard-wired for sound. Even when a person does not have access to intelligible speech during the critical period of language development, the primary auditory cortex still responds to auditory stimuli, as seen in these young adults who were first introduced to sound via a CI after a lifetime of deafness. An extended duration of deafness likely reduces the responsiveness of the primary auditory cortex and promotes its reorganization (Kral. Trends Neurosci 2012;35[2]:111 http://bit.ly/2cm4s8z). Nonetheless, the primary auditory cortex remains largely responsive to sound, which may explain why adults who never had access to sound and communicated via sign language their entire lives can still detect whisper-level sounds with a cochlear implant.

Copyright © 2016 Wolters Kluwer Health, Inc. All rights reserved.