1 First, what exactly do you mean when you say “localization”?
Localization is the term we use for how we know the location of a sound-producing object. Directional hearing is one part of localization; it allows us to determine the direction of the sound source in three-dimensional space. Distance hearing is another part; it tells us how far away a sound source is.
2 Does it really matter if we have good localization?
It certainly does. Only the visual and auditory senses extend beyond the boundaries of our bodies, and these senses serve different purposes. The visual sense gives us tremendous resolution over a small area, while hearing enables us to monitor what's going on in all directions and helps us decide where to direct visual attention.1 Also, our awareness of the position and movement of sound sources is important in providing us with a sense of psychological comfort and security in a listening environment. Finally, and perhaps most convincingly, in interviews, hearing-impaired people volunteer that localization and other aspects of spatial hearing are important activities that are limited by hearing impairment.2
3 It's been a while since my psychoacoustics class. Can you remind me how we localize sounds?
We must first divide the direction into two axes. The first axis is the horizontal plane, which is the right-left dimension (also called azimuth). The other axis is the vertical plane, which is the up-down dimension (also called elevation).
Many cues have been shown to be important in horizontal plane localization, and each cue seems to take on different importance depending on listening conditions. For the environment that we know the most about, one with extremely low noise and virtually no reflections, monaural and binaural cues both operate. One of the monaural cues is the overall loudness of the sound. Sounds arriving from the side ipsilateral to an ear will have a greater overall level at that ear than sounds arriving from the contralateral side. Also, the spectrum of the sound heard by the listener will change with the orientation of the sound source to the head and body. Diffraction of sounds around the features of the pinna causes sharp peaks and dips in the frequency response of the pinna, and these spectral changes are audible.3,4 It is important to note that these cues rely on the presence of a stored prototype, or template, of the sound,5 so cognition is involved in the use of monaural cues.
4 I remember that there are also cues that require signals from both ears. What are those again?
The binaural cues have received a great deal more attention from researchers. One of these cues is the interaural phase difference. The distance from the sound source to the ear changes with the angle of the sound source relative to the listener. Because our ears are located on opposite sides of the head, the sound path to one ear is either shorter or longer than the sound path to the other ear unless the sound source is directly in front of or directly behind the listener. For example, if a sound source is located anywhere to the left side of the listener's head, the path to the left ear will be shorter than to the right ear.
The maximum difference in the length of the path traveled by sound occurs when the source is directly to one side of the listener's head (i.e., 90° or 270° from directly in front of the listener). These differences in length mean that one ear will receive a signal a short time before the other ear. Under ideal conditions, the human auditory system can detect time differences of approximately 10 μs,3 which correspond to only a few degrees on a circle.
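The geometry behind these path-length differences can be sketched with the classic Woodworth spherical-head approximation. The sketch below is illustrative only: the function name, the assumed head radius of 8.75 cm, and the assumed speed of sound of 343 m/s are my choices, not values taken from the text.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, in air at roughly 20 degrees C (assumed)
HEAD_RADIUS = 0.0875    # m, a commonly assumed average adult head radius

def interaural_time_difference(azimuth_deg: float) -> float:
    """Approximate the interaural time difference (in seconds) for a
    distant source using the Woodworth spherical-head model:
    ITD = (a / c) * (theta + sin(theta)), with theta the azimuth in
    radians measured from straight ahead."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

# The path-length difference, and therefore the ITD, is greatest when the
# source is directly to one side of the head (90 degrees from the front):
print(round(interaural_time_difference(90.0) * 1e6))  # about 656 microseconds
```

Against a maximum ITD on the order of 650 μs, a detection threshold of roughly 10 μs corresponds to an azimuth change of only a degree or two near the midline, which is why such fine temporal resolution translates into useful directional acuity.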
High-frequency interaural time differences are also used as cues, especially when no other cues are available. Just as in the low frequencies, it takes time for the high frequencies to travel the additional distance to the far ear. In the high frequencies, the time difference is coded not by the fine stimulus characteristics (i.e., the instantaneous amplitude function), but by the stimulus envelope.6
The location of a sound source can also be inferred from differences in sound level across ears. These differences occur in frequencies where the wavelength of the sound is small relative to the dimensions of the head. These differences become useful at frequencies of 3000 Hz and above.7 For sound sources more than 1 to 3 meters from the listener, the magnitude of interaural level differences is a maximum of 5 dB between 800 Hz and 1000 Hz, but increases to a maximum of over 30 dB between 8000 Hz and 10,000 Hz. For closer sound sources, interaural level differences occur at much lower frequencies, again because of how the head interacts with sound waves whose wavefronts are significantly curved.
5 So far you've only talked about the horizontal plane. What about the cues used in vertical plane localization?
To determine the direction of a sound in the vertical plane, the primary acoustic cue is generated by the pinna. When sound reaches the listener from different elevations, the concha has different effects on the sound, creating sharp peaks and dips in the signal spectrum. The listener uses these as a cue to the vertical location of a sound source by comparing the observed spectrum of the sound with the expected spectrum and by applying his experience with the general effects of the concha to infer the location of the sound source.
6 Earlier you mentioned distance. How does that fit into all of this?
The way we perceive distance tends to change with the distance and the listening conditions. I'll base the answer on a stationary listener, because it is simpler and because different cues could be available when the listener is moving.8
If the sound source is closer than 3 meters, complex spectral differences can be used as a cue. For the human head, the surface of the sound wave would be significantly curved at distances less than about 3 meters. This changes the way that the head and pinna alter the sounds, and therefore becomes a cue to the distance of the sound source.
For sound sources more than about 3 meters away from the listener, the primary cue is the overall level of the sound, which decreases as the distance to the sound source increases.3 Although a strict interpretation of this statement suggests that all low-level sounds have sources far away from the listener, it is not that simple. The listener's expectations of the sound level are compared with his observations. For example, one would expect that the sound level of a slammed door would be loud, so if a listener finds this sound to be softer than expected, he will conclude that the source was far away.
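The distance-related drop in overall level follows the free-field inverse-square law, about 6 dB per doubling of distance. That standard acoustics relationship is not spelled out in the text above, so the short sketch below (function name my own) is offered only as an illustration.

```python
import math

def level_change_db(ref_distance_m: float, new_distance_m: float) -> float:
    """Change in sound pressure level (dB) when the listener-to-source
    distance changes from ref_distance_m to new_distance_m in a free
    field (inverse-square law). Negative values mean a softer sound."""
    return 20.0 * math.log10(ref_distance_m / new_distance_m)

# Moving a source from 3 m to 6 m (a doubling) lowers the level by ~6 dB:
print(round(level_change_db(3.0, 6.0), 1))  # -6.0
```

Of course, as the text notes, the listener does not read distance straight off the level; the observed level is compared against the level the listener expects for that kind of sound.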
In environments with significant reverberation, the difference in sound level between the direct and reflected sound can also serve as a cue.3 For faraway sound sources (>15 m), there is an additional spectral cue. Air functions as a low-pass filter, so high-frequency energy is reduced with distance, providing an additional spectral cue.
In addition to cues that are provided by physical acoustics, the cognitive cues of expectations and familiarity with the stimulus are also important. Familiarity with a sound helps the listener estimate its distance correctly. For familiar sounds, distance judgments can be accurate until the sound source is up to 6 to 8 meters away. So, if a listener knows the typical acoustic characteristics (e.g., typical loudness, average frequency spectrum) of the sound, the listener can better judge its distance. For unfamiliar sounds, the range where accurate judgments can be obtained is reduced to 2 to 4 meters.3
7 You mentioned that horizontal plane localization cues have been extensively studied in listening environments with low noise and few reflective surfaces. Can we generalize these findings to the real world?
You make an important distinction. An anechoic environment is a very special place. As you probably know, anechoic chambers are spaces that absorb acoustic energy far more effectively than listening environments such as a sound booth, and over a wider frequency range. For this sort of research, it is important first to understand how the auditory system works in anechoic environments, because these environments are acoustically simple. The importance of some localization cues will change with the listening environment. So, although knowledge of how the auditory system works in anechoic conditions is important, researchers must also determine how the system works in more realistic environments.
8 What happens when we listen near reflective surfaces?
Acoustic energy coming from a single source appears as if it were coming from a large number of similar, but spatially distributed, sound sources. Reflected signals behave as if they come from a totally different sound source beyond the boundaries of the room.3 So, in the presence of reflections, the auditory system has to contend with many different sound sources, even though only one physical sound source may exist. The auditory system handles these phantom sound sources by attending to only the first signal that arrives, using a complex, cognitively driven process called the precedence effect.3
In listening environments with reflective surfaces, some localization cues can become inaudible or unreliable. In such cases, the auditory system tends to use the cues that are most readily available and most consistent with the listener's expectations of the acoustic environment. The auditory system's consideration of the plausibility of cues is a challenging phenomenon to understand, and makes it difficult to generalize results from anechoic environments to more common environments.
A mechanism (e.g., low-frequency interaural time differences) that yields very plausible cues in one environment may not do so in another. In acoustically complex environments, low-frequency interaural time differences tend to give implausible cues because the auditory system can no longer separate the sound that comes directly from the sound source from the sound that comes from the phantom sources, so these cues are largely ignored. The cognitive portion of the auditory system recognizes this situation and discounts the cues.
9 Are you saying that localization involves more than the acoustical properties of the stimulus?
Yep. Lots more. It is difficult to discuss localization in real environments without invoking the cognitive traits of sensory memory and expectations. These are what make it so difficult to interpret basic localization studies in terms of real-world effects. The stimuli are constantly changing, the importance of cues is variable, and the listener's experience and expectations of the sounds are also dynamic. Currently, we know quite a lot about how things work in rigidly controlled environments, but it is possible that the rules are substantially different in more complex environments.
10 We often need to localize a sound in a noisy environment. Does background noise affect localization?
Noise is an important concern in localization. Listeners are regularly in environments with poor (adverse) signal-to-noise ratios.9 Although most localization research has been conducted in quiet, a few studies have been done in the presence of noise.10-13 In general, these studies have shown that horizontal plane localization can be resistant to the effects of noise, with some listeners nearing perfect performance at an SNR as low as 2 dB. However, not all listeners obtain equally good localization scores in these conditions.
11 So far you've been discussing normally hearing listeners. How does hearing impairment affect localization?
Problems with localization occur even with mild hearing impairments.14 The biggest effect of hearing impairment on localization is a lack of audibility. To state the obvious, a sound must be heard before its source can be correctly located. Horizontal plane localization scores tend to remain stable until bilaterally averaged thresholds (500 Hz to 4000 Hz) are worse than about 50 dB HL, even in an environment with a low signal-to-noise ratio.10 However, subjects with poorer thresholds tend to have poorer localization, and this decrement cannot be attributed entirely to audibility.13
12 Does the type of hearing impairment matter?
Yes, it does. Listeners with sensorineural hearing impairment tend to retain their localization skills better than those with conductive losses. People with conductive impairments have more trouble with localization because a larger proportion of sound energy is transferred through the bones of the skull as the conductive pathology gets worse. As a result, signals from both ears reach both cochleas at nearly the same time and energy level. This reduction in bilateral cochlear isolation causes a reduction of binaural differences; hence binaural localization cues are also reduced.
Hearing aids increase the proportion of energy that is transferred via the air-conduction pathway, thus increasing the cochlear isolation and making binaural differences more detectable.15
13 Can hearing aids restore localization to normal?
Hearing aids increase the amount of audible sound. In terms of horizontal plane localization, the restoration of audibility can provide a listener with nearly normal skills, but, in some cases, hearing aids can impair horizontal localization.16 In many of these cases, it appears that the hearing aid interferes with low-frequency interaural time differences, and the use of an open earmold can improve performance.
In the vertical plane, hearing aids do little to restore localization cues. This is because these cues reside above the audible frequency range of most hearing aid users and because the concha resonances responsible for vertical plane localization cues are disturbed by most hearing aid earmolds. We don't know much about what hearing aids do to distance perception, but it seems logical that the size of the listener's auditory field should increase after the hearing aid fitting, so, once accustomed to the sound of the hearing aid, listeners should have improved distance perceptions.
14 Binaural hearing aid fittings improve localization, don't they?
Sometimes, but first we must distinguish binaural hearing aid fitting from bilateral hearing aid fitting from binaural hearing. For example, a listener with a mild, symmetrical hearing impairment and only one hearing aid will regularly hear moderate and high-level signals through both ears, making binaural cues available. Therefore, a unilateral hearing aid fitting does not necessarily imply monaural hearing and, conversely, a bilateral hearing aid fitting does not guarantee binaural hearing. The main question is how many important signals were heard only monaurally before the hearing aid fitting.
For sounds intense enough to provide binaural cues in the unaided condition, hearing aids tend to make localization a bit poorer,17 at least at the start. However, for worse hearing losses, this disadvantage is outweighed by the increase in the number of audible sounds. The borderline where one can expect a localization benefit seems to be an average hearing level (500 Hz to 4000 Hz) around 50 dB HL.18 People with thresholds worse than this borderline will tend to notice improved localization; those with better hearing may not perceive much difference in average listening environments.
15 I've heard that directional hearing aids help directional hearing. Does that make sense?
Not really. Unfortunately, the same word is used to describe totally different concepts. The term “directional hearing” refers to a listener's ability to infer the direction of a sound source, which is a type of localization where distance is not a concern. Directional microphones in hearing aids are designed to have lower sensitivity for sounds that don't come from directly in front of the listener. This reduces audibility for signals coming from other directions, which should make it harder, not easier, for the auditory system to detect localization cues.
16 So, can I expect my clients wearing directional microphones to have poorer localization?
Fortunately, the situation is not that simple. Directional microphones are not sufficiently effective to eliminate localization cues altogether. Also, many localization cues can be received from any sound source, and the auditory system takes advantage of this redundancy. Earmold/vent systems pass low-frequency cues without much attenuation,19 so these cues are often retained regardless of the directional microphone. Also, it is important to recognize that cognitive localization cues will not be impacted by a directional microphone, which can help listeners overcome reduced acoustic information.
17 Does the size or style of hearing aid matter for localization?
Once the effects of audibility have been controlled, few differences can be expected in the horizontal plane. However, for new hearing aid users, it is possible that the transition from unaided to aided localization will be easier with CIC-style hearing aids.17,20 Few hearing aids are likely to make vertical plane localization cues audible, but in cases where such amplification is obtained, the completely-in-the-canal style would likely be preferable because the microphone is located inside the entrance to the canal and therefore does not interfere with the generation of vertical localization cues in the concha bowl. The usefulness of these acoustic cues depends on whether or not the listener has sufficient frequency resolution to discriminate the spectral peaks and dips.
18 I've heard that the type of earmold has an impact on localization. Is that true?
There is reason to believe that open or sleeve earmolds or tube fittings are desirable for people with good hearing in the high frequencies and poor hearing in the low frequencies.15,21 These types of molds pass low-frequency signals without phase distortion, thus making them available to the auditory system. For people with good low-frequency hearing and poor hearing in the high frequencies, earmolds that do not pass low-frequency signals well should be avoided. This means that it is desirable to have larger vents and, if gain and feedback considerations allow, open or sleeve-style earmolds.15
19 What about processing delay times in digital hearing aids? Can they interfere with localization?
Digital hearing aids give us tremendous power to alter sounds in remarkable ways. Although the potential benefits are clear, there is also the possibility of entirely new varieties of distortion, such as digital processing delay times. One should be concerned with delay as a cause for problems with the sound quality of the user's own voice.22,23
Regarding localization, the primary problem I can see with long delay times is destructive interference between the acoustic signal that passes by the hearing aid and the amplified sound. For some amounts of delay, the interference could move into frequency regions where the audibility of localization cues is reduced.
A second concern is that the delay times might not be bilaterally matched, either because of a unilateral fitting or because of mismatched delays for bilateral hearing aid users. However, if the mismatch in delay time does not change often, people adapt to new interaural time difference cues over a period of hours or days.24-26
In signal processors with constantly changing delay, it seems likely that implausible time-based cues will result and that this class of cues will be neglected by the auditory system. This may or may not have an impact on localization, depending on whether or not more reliable cues are available.
20 It sounds as if my patients could vary quite a lot on localization ability and that the effects of hearing aids are not easily predictable. How can I find out about my clients' localization before and after the hearing aid fitting?
You'll have to test them. This seems to be an area where questionnaires are preferable to psychophysical tests. Although the controlled environment of a sound booth or headphones simplifies the interpretation of other tests, it complicates the interpretation of localization tests.
Four questionnaires that contain items dealing with localization are the Localization Abilities in Typical Environments (LOCATE) questionnaire,27,28 the localization scales in the Amsterdam Inventory of Hearing Disability and Handicap,29 the Hearing Measurement Scale,2 and the Gothenburg Profile.30 Each has been shown to have good validity and reliability as measures of localization skills, and many items are shared across the questionnaires. Depending on the situation, it may be desirable to have the focused evaluation of localization provided by the LOCATE questionnaire, or it may be preferable to have localization items mixed in with other topics, which is how the other three questionnaires are designed.
In recent months, many of you have heard about a unique fly named Ormia ochracea. Assisted by a pair of tympanic membranes, one on each side of its body, this fly can precisely locate the chirp of a cricket at great distances. This extraordinary localization ability is critical to the fly's existence, as it deposits hundreds of tiny larvae on the cricket's back. While most of us do not rely on auditory localization for reproductive purposes, it nonetheless is a very important aspect of our hearing world.
Auditory localization is facilitated by several factors, including interaural phase and timing differences, overall loudness, the spectrum of the signal, and our long-term experiences. Good localization enhances our safety and provides practical cues that assist in speech communication. Localization of sounds in our everyday environment also provides a level of psychological comfort that many of us take for granted. When new hearing aid users say, “Everything just seems so normal,” it just could be that improved localization is contributing to this perception.
Speaking of hearing aid users, given the factors that are important for localization, it's not surprising that hearing loss has a significant impact on localization ability. For the most part, this is due to a reduction in audibility, which means that when bilateral (binaural) hearing aids are fitted (especially the CIC style) on people with only mild-to-moderate hearing loss, localization ability can be restored to near-normal levels. But sometimes hearing aid fittings can make things worse.
Here to tell us all about auditory localization and the interaction with hearing loss and hearing aid fittings is Greg Flamme, PhD, an assistant professor in the Department of Speech Pathology and Audiology at the University of Iowa. After cutting his audiologic teeth close to home at the “Cornhusker Hearing Center” in the early 1990s, Dr. Flamme moved on to work with Robyn Cox at the University of Memphis, and, more recently, with Ruth Bentler at Iowa. As you might guess, hearing aid performance is one of his research interests, to which he's added a heavy dose of biostatistics.
While at the University of Memphis, Greg worked on the development and standardization of the Localization Abilities in Typical Environments (LOCATE) questionnaire. And as you can tell from this excellent Page Ten article, he is continuing to study and critically examine various methods for measuring this important hearing function.
While auditory localization may not be as critical for us and our patients as it is for Ormia ochracea, it is an important and sometimes overlooked area of rehabilitative audiology.
Page Ten Editor
1. Sekuler R, Blake R: Perception, 3rd ed. New York: McGraw-Hill, 1994: 105–106.
2. Noble WG, Atherley GRC: The Hearing Measurement Scale: A questionnaire for the assessment of auditory disability. J Aud Res
3. Blauert J: Spatial Hearing: The Psychophysics of Human Sound Localization. Cambridge, MA: MIT Press, 1997.
4. Moore BC, Oldfield SR, Dooley GJ: Detection and discrimination of spectral peaks and notches at 1 and 8 kHz. J Acoust Soc Am
5. Durlach NI: Binaural signal detection: Equalization and cancellation theory. In Tobias TJ, ed. Foundations of Modern Auditory Theory. New York: Academic Press, 1972: 369–462.
6. van de Par S, Kohlrausch A: A new approach to comparing binaural masking level differences at low and high frequencies. J Acoust Soc Am
7. Wightman FL, Kistler DJ: Factors affecting the relative salience of sound localization cues. In Gilkey RH, Anderson TR, eds. Binaural and Spatial Hearing in Real and Virtual Environments. Mahwah, NJ: Lawrence Erlbaum Associates, 1997: 1–24.
8. Mershon DH: Phenomenal geometry and the measurement of perceived auditory distance. In Gilkey RH, Anderson TR, eds. Binaural and Spatial Hearing in Real and Virtual Environments. Mahwah, NJ: Lawrence Erlbaum Associates, 1997: 257–274.
9. Pearsons KS, Bennett RL, Fidell S: Speech levels in various noise environments. Project report on contract 68 01–2466. Washington, DC: Office of Health and Ecological Effects, US Environmental Protection Agency, 1977.
10. Flamme GA: The Relationships Between Direction and Distance Hearing and Other Auditory Traits: A Multitrait-Multimethod Evaluation (UMI Publication #9967035). University of Memphis, 2000.
11. Good MD, Gilkey RH: Sound localization in noise: The effect of signal-to-noise ratio. J Acoust Soc Am
12. Lorenzi C, Gatehouse S, Lever C: Sound localization in noise in normal-hearing listeners. J Acoust Soc Am
13. Lorenzi C, Gatehouse S, Lever C: Sound localization in noise in hearing-impaired listeners. J Acoust Soc Am
14. Kramer SE, Kapteyn TS, Festen JM: The self-reported handicapping effect of hearing disabilities. Audiology
15. Byrne D, Noble W: Optimizing sound localization with hearing aids. Trends Amplif
16. Noble W, Sinclair S, Byrne D: Improvement in aided sound localization with open earmolds: Observations in people with high-frequency hearing loss. JAAA
17. Ebinger KA, Grantham DW, Trine TD: Sound localization with ITE and CIC hearing aids. Poster presented at the American Academy of Audiology Annual Convention, 1996.
18. Byrne D, Noble W, LePage B: Effects of long-term bilateral and unilateral fitting of different hearing aid types on the ability to locate sounds. JAAA
19. Dillon H: Hearing Aids. New York: Thieme, 2001.
20. Agnew J, Ewert C: Study compares localization ability unaided and when using CIC hearing aids. Hear J
21. Byrne D, Sinclair S, Noble W: Open earmold fittings for improving aided auditory localization for sensorineural hearing losses with good high-frequency hearing. Ear Hear
22. Agnew J, Thornton JM: Just noticeable and objectionable group delays in digital hearing aids. JAAA
23. Stone MA, Moore BC: Tolerable hearing aid delays. I. Estimation of limits imposed by the auditory path alone using simulated hearing losses. Ear Hear
24. King AJ, Kacelnik O, Mrsic-Flogel TD, et al.: How plastic is spatial hearing? Audiol Neurootol
25. Florentine M: Relation between lateralization and loudness in asymmetrical hearing losses. J Am Aud Soc
26. Javer AR, Schwarz DW: Plasticity in human directional hearing. J Otolaryngol
27. Flamme GA, Cox RM, Alexander GC, et al.: Localization disabilities in real-world situations. Poster presented at the American Academy of Audiology Annual Convention, 1999.
28. Flamme GA: Examination of the validity of auditory traits and tests. Trends Amplif
29. Kramer SE, Kapteyn TS, Festen JM, Tobi H: Factors in subjective hearing disability. Audiology
30. Ringdahl A, Eriksson-Mangold M, Andersson G: Psychometric evaluation of the Gothenburg Profile for measurement of experienced hearing disability and handicap: Applications with new hearing aid candidates and experienced hearing aid users. Br J Audiol