Physiology of APD

The Physiology of Auditory Processing

Sound reception occurs through a complex physiological process whereby sound waves travel through three sections of the ear:

  • The outer ear
  • The middle ear
  • The inner ear

Outer Ear – the pinna

Sound is most accurately understood as what we hear when air pressure fluctuates. These fluctuations produce sound waves of a certain frequency (pitch) and amplitude (volume). The sound waves travel from the outer ear – the pinna – into the ear canal (also known as the external auditory meatus). During this process the sound is slightly modified by the shape of the outer ear and the canal.
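To make these two properties concrete, the short Python sketch below (illustrative only, not part of the original text) generates pure tones whose pitch and volume are set directly by a frequency and an amplitude parameter.

```python
import numpy as np

def pure_tone(frequency_hz, amplitude, duration_s=1.0, sample_rate=44100):
    """Generate a pure tone: a sine wave defined only by its
    frequency (pitch) and amplitude (volume)."""
    t = np.linspace(0.0, duration_s, int(sample_rate * duration_s), endpoint=False)
    return amplitude * np.sin(2 * np.pi * frequency_hz * t)

# A soft, low-pitched tone and a louder, higher-pitched one.
low_soft = pure_tone(frequency_hz=110.0, amplitude=0.1)
high_loud = pure_tone(frequency_hz=880.0, amplitude=0.8)
```

Real sounds, including speech, are rarely this simple; the later discussion of the cochlea returns to complex sounds built from many frequencies.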

Middle Ear – the eardrum, malleus, incus and stapes

The ear canal ends at the eardrum, which is stimulated by the sound waves and vibrates in response. The eardrum is connected to the malleus, the incus and the stapes, three tiny bones that amplify and conduct the vibrations to the oval window, part of the cochlea. There are also two muscles in the middle ear – one controls the tension in the eardrum (the tensor tympani) and the other plays a protective role in reducing the volume of very loud sounds (the stapedius). These are the muscles strengthened when the Tomatis Method is administered.

Inner Ear – the cochlea and vestibule

The cochlea and vestibule sit alongside each other in the inner ear. The cochlea is responsible for perceiving all sounds, while the vestibule receives bone-conducted sound and maintains the body’s balance and coordination. The cochlea is made up of three fluid-filled chambers – the scala vestibuli, the scala media and the scala tympani. Vibrations at the oval window set up waves in this fluid, and the energy from these waves is picked up by tiny sensory cells along the basilar membrane. These cells carry hair-like projections (cilia) and are usually called hair cells; they convert the waves into electrical impulses. Different hair cells are stimulated depending on the frequency of the sound, which allows the brain to discriminate between various pitches.

The function of the ear

The ear is the most fundamental of all sensory organs – it is the first to develop in the womb and is fully functional at birth. Whereas our eyes can only perceive light within a limited range of the spectrum, the ear can perceive sound ranging from extremely soft to extremely loud and everything in between. The drawback of such a sensitive organ is that it is harder to inhibit. Our eyes shut in response to bright light and we automatically draw back from painful touch, but there is no physical reflex for shutting out sound. Instead, the brain controls how the ear takes in information.

Because the ear functions as both a sound receptor (cochlea) and a balance coordinator (vestibule), it is neurologically linked to the whole body. It has been shown that the function (or dysfunction) of the auditory system can affect emotional, motor and sensory areas, impacting many aspects of a person’s well-being.

There are two primary organs in the auditory system: the cochlea and vestibule. They are both located in the inner ear and although they attend to different information, their proximity allows them to complement each other. Alfred Tomatis was a strong proponent of the view that the cochlea and vestibule do not operate independently, and his therapies reflect this assumption.

Sound pressure waves from the external environment are transmitted to the inner ear through two routes. Air-conducted sound waves enter through the ear canal, while bone-conducted sound waves are transmitted directly to the inner ear through the bones of the skull. The cochlea analyses both types of sound. The function of the vestibular system is more complex, as it both responds to and controls the position of the body.

The vestibular system senses any loss of balance in the body and responds with reflex mechanisms that attempt to regain and maintain equilibrium through muscular movements. If a person is off balance, the two fluid-filled sacs in the vestibule register this and trigger the muscle movements required to return to a balanced position. The vestibular system acts on knowledge of where the head is, and plays an important role in integrating all the other senses. Paul Madaule refers to the vestibular system as the ‘ear of the body’, a phrase that captures this whole-body role.
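As a loose engineering analogy (not a claim from the original text about how the vestibule actually computes), this righting reflex behaves like a feedback loop: a corrective movement proportional to the sensed tilt is applied repeatedly until the body is back near upright.

```python
def rebalance(tilt_deg, correction_gain=0.5, tolerance_deg=0.5):
    """Loose analogy for the vestibular righting reflex: keep applying a
    corrective muscle adjustment proportional to the sensed tilt until the
    body is back within a small tolerance of upright.
    (Illustrative parameters, not physiological measurements.)"""
    steps = 0
    while abs(tilt_deg) > tolerance_deg:
        tilt_deg -= correction_gain * tilt_deg  # corrective movement
        steps += 1
    return steps

# Starting 10 degrees off balance, the loop settles back near upright
# after a handful of corrective adjustments.
print(rebalance(10.0))
```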

The vestibule as an organ of sensory integration and balance

The ear is the first sensory organ to develop in a foetus, and all subsequent development of the other sensory organs takes place with reference to the vestibular system. It is not surprising then that a poorly developed vestibule can be the source of problems in multiple areas. What should be understood from this is that the ear is not only for listening. It also plays a pivotal role in analysing all the new sensory information that the body receives, which is why auditory dysfunction can have such far-reaching effects.

Vestibular dysfunction can be recognised by a variety of symptoms, some of which include:

  • Poor balance and coordination
  • Difficulty with fine and gross motor skills
  • Confusion between left and right
  • Poor sense of direction and spatial awareness
  • Frequent disorientation
  • Constant seeking of vestibular stimulation, such as rocking or excessive movement (hyperactivity)
  • Avoidance or fearlessness of heights.

Gravity acts on the body as a constant force and point of reference for both centring and balance. The vestibule makes use of gravitational information to determine where the body is positioned and how best to move based on the body’s centre of gravity at a given time. The vestibule also allows us to comprehend and make use of the three-dimensional nature of the world. The concepts of time and space are elements of this three-dimensional perception – for example, when a ball is thrown to us we can anticipate when we will need to clasp our hands to receive it (time) based on how far away the other person is (space).
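The ball example reduces to simple arithmetic: time of arrival is distance divided by speed. The figures in the sketch below are assumptions chosen purely for illustration.

```python
def time_to_arrival(distance_m, speed_m_per_s):
    """Anticipating 'when' (time) from 'how far' (space): a thrown ball's
    arrival time is simply distance divided by speed."""
    return distance_m / speed_m_per_s

# Assumed figures for illustration: a ball thrown from 5 m away at 10 m/s
# gives the catcher roughly half a second to get their hands ready.
print(time_to_arrival(5.0, 10.0))  # 0.5 seconds
```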

In addition to this, the vestibule’s function as a sensory integration mechanism means that all space and time concepts associated with each sense can only be appreciated if the vestibular system is operating efficiently. All information from the muscles, joints, ears and eyes are first transmitted to the vestibule before further processing can take place.

For example, a group performing a choreographed dance must associate the auditory cues from the music (integration) with the learnt movements (coordination) so that they occur simultaneously (time), and the dancers must also be aware of the room available to them on the stage (space). The centrality of the vestibule means that any disruption to the system has a detrimental effect on all areas of functioning, as the multiple cues received from the senses cannot be arranged to form a coherent understanding of the environment or task.

In terms of the vestibule’s role in balance, it receives information from three main sources: our vision, our body’s position, and activity within the vestibule itself. It uses this data to monitor four main areas essential to balance. Firstly, it helps to maintain posture; secondly, it assists in giving us physical awareness of our head, body and movement; thirdly, it controls the fine movements of the eyes; and lastly, it organises our motor responses so that we are able to perform physical activities requiring coordination.

The vestibular system communicates its information directly through the vestibular nerve to the cerebellum (a part of the brain heavily involved in receiving and organising sensory information). The cerebellum allows us to develop an awareness of our body’s position, weight and movement by appropriately integrating and organising this sensory information. The cochlea and vestibule are often described as independent systems, but they are located beside each other and share some of the same anatomical components, including the same cranial nerve (the vestibulocochlear nerve). It comes as no surprise, then, that the vestibular system is heavily influenced by sound, although its primary function is generally understood as the control of balance and coordination.

Because the vestibular system is responsible for the very fine movements of the eye and its visual tracking capability, children with APD frequently not only have poor coordination but also lack the ability to focus and control their eye muscles. This skill is essential to reading and to integrating the sounds of words with their written form – for example, a child who cannot track words or who skips lines will lose his or her place on the page. A weak vestibular system can therefore contribute to learning difficulties.

Cochlea as an organ of sound reception, discrimination and decoding

Where the vestibular system is a mechanism for balance and coordination, the cochlea has the seemingly simpler role of attending to all auditory information. The cochlea also controls the audio-vocal feedback mechanism – our ability to monitor and modify our own voice based on hearing ourselves speak and adapting to different contexts. People with auditory processing problems lack the listening skills to identify the appropriateness of the volume and tone of their own voices. The voice can only reproduce what the ear hears, and if sound reception for our own voices is poor, so too will be our vocal production.

The hair cells in the cochlea are stimulated by sound. Specific hair cells (of which there are about 17,000) are attuned to particular pitches (frequencies) and respond to them with movement, transmitting information neurologically to the relevant area of the brain for interpretation. The cochlea operates like the keyboard of a piano. It is coiled in a spiral within the ear, but if it were laid flat, the low frequency sounds could be seen to correspond with one end of the cochlea while the high frequency sounds correspond with the opposite end. This is referred to as tonotopic organisation – organisation based on tone. In the case of APD, some of the hair cells may not be stimulated as a result of poor processing at certain frequencies, and this can disrupt the entire system of auditory processing.
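The piano-keyboard picture can be made numerical with the Greenwood frequency-position function, a standard empirical model (not mentioned in the original text) that maps a point along the uncoiled human cochlea to the frequency that best stimulates it. The constants below are the commonly cited human values; the sketch is illustrative rather than clinical.

```python
def greenwood_frequency(x_from_apex):
    """Greenwood frequency-position function for the human cochlea.
    x_from_apex: relative distance along the basilar membrane,
    0.0 at the apex (low frequencies) to 1.0 at the base (high frequencies).
    Constants are the commonly cited human values (Greenwood, 1990)."""
    A, a, k = 165.4, 2.1, 0.88
    return A * (10 ** (a * x_from_apex) - k)

# The 'keyboard' laid flat: low notes at one end, high notes at the other.
for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"{x:.2f} of the way from the apex -> ~{greenwood_frequency(x):,.0f} Hz")
```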

The way that sound plays on a poorly functioning cochlea is similar to the way fingers play a badly tuned piano. Even though most of the keys may play normally, the few that do not will have a negative effect on the entire piece of music. Similarly, even a minor weakness in the cochlea can have a considerable impact on auditory processing, because most sounds are complex and comprise multiple frequencies. Unless all frequencies cause appropriate cochlear stimulation, the sound will not be perceived accurately.

The sounds of language are rarely pure – that is, they are made up of a combination of frequencies: a base (fundamental) frequency plus higher harmonics. Some language sounds have similar base frequencies but different higher harmonics, such as ‘t’ and ‘d’. People with normal auditory perception have no trouble distinguishing such sounds, but a dysfunctional cochlea can result in an inability to process these subtle differences. This impedes language acquisition and produces many of the behavioural symptoms noticed in children who have APD or related language disorders such as dyslexia.
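As a rough numerical sketch of this idea (the frequencies and weights below are arbitrary illustrative values, not measurements of ‘t’ or ‘d’), a complex sound can be built as a base frequency plus weighted higher harmonics; two sounds sharing the same base but differing only in their upper harmonics become effectively indistinguishable to a listener who cannot resolve those harmonics.

```python
import numpy as np

def complex_tone(base_hz, harmonic_weights, duration_s=0.5, sample_rate=44100):
    """Build a complex sound from a base (fundamental) frequency plus
    weighted higher harmonics (integer multiples of the base)."""
    t = np.linspace(0.0, duration_s, int(sample_rate * duration_s), endpoint=False)
    tone = np.zeros_like(t)
    for n, weight in enumerate(harmonic_weights, start=1):
        tone += weight * np.sin(2 * np.pi * base_hz * n * t)
    return tone

# Same base frequency, different upper harmonics (arbitrary weights).
# If the upper harmonics are not registered, the two sounds collapse
# into essentially the same signal.
sound_a = complex_tone(150.0, [1.0, 0.5, 0.3, 0.1])
sound_b = complex_tone(150.0, [1.0, 0.5, 0.1, 0.4])
```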

The cochlea decodes high, medium and low frequency sound, locates the sound by pinpointing its origin, and discriminates between relevant and irrelevant information. The brain assigns meaning to the sound, which often involves a motivational component. For example, a boy playing in a noisy park may be called by his mother and despite the other environmental distractions, will both attend to his mother’s voice and turn his head in the direction of the sound.

Sound localisation draws on three types of auditory ability: timing, intensity and frequency filtering. All of these are neurological processes that operate outside our conscious control. The brain chooses its method of sound localisation based on the properties of the sound.

The two main methods are timing and intensity. The timing method is employed when the brain detects that the sound waves reached one ear slightly before the other. That is, the voice of a person talking on our left arrives at the left ear a fraction of a second before the right ear. The medial superior olive, located in the brainstem, receives this information from the cochlea, makes sense of the timing difference, and alerts us to the origin of the sound based on the degree of delay between the ears.

While the element of timing is appropriate for low frequency sounds, high frequency sounds are best located through the intensity method. The cochlea in the left and right ear will register differing volumes of sound based on the source of the sound. This information about volume is transmitted to the lateral superior olive, again located in the brainstem, which then tells us where the sound is coming from.
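In the hearing-science literature these two cues are usually called the interaural time difference (the timing cue handled by the medial superior olive) and the interaural level difference (the intensity cue handled by the lateral superior olive). The sketch below uses textbook simplifications – a spherical-head approximation for timing and a crude distance-only estimate for level – so the numbers are illustrative assumptions, not clinical values.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air
HEAD_RADIUS = 0.0875     # m, a typical textbook approximation

def interaural_time_difference(azimuth_deg):
    """Timing cue: Woodworth's spherical-head approximation of how much
    earlier a sound reaches the nearer ear. azimuth_deg is 0 straight
    ahead and 90 directly to one side."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

def interaural_level_difference(distance_near_m, distance_far_m):
    """Intensity cue, crudely approximated here by distance alone
    (real level differences are dominated by head shadowing and are
    strongest for high frequency sounds)."""
    return 20.0 * math.log10(distance_far_m / distance_near_m)

# A talker directly to the left arrives at the left ear roughly 0.7 ms
# earlier than at the right ear, and slightly louder as well.
print(f"Timing cue at 90 degrees: {interaural_time_difference(90) * 1000:.2f} ms")
print(f"Level cue for 1.0 m vs 1.15 m: {interaural_level_difference(1.0, 1.15):.1f} dB")
```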

In addition to these neurological processes, each ear contributes a degree of assistance in sound localisation. The outer area of the ear (the pinna) provides a frequency filter by amplifying some sounds and reducing the volume of others. This physical structure of the pinna provides the extra information required to successfully determine the origin of a sound.

Of course, it is not enough for the cochlea merely to register a sound – the meaning of the sound must also be analysed. The brain is responsible for attaching significance and definition to sound, and does so based on present expectation and prior experience. For example, Australia is a multicultural country where many people speak with an accent, so the greeting ‘Good morning’ is continually heard sounding slightly different depending on the speaker. The brain is able to decipher these different sounds as meaning the same thing by being flexible enough to adapt new information into existing frameworks. This flexibility makes learning easier.

It is clear that auditory processing is dependent on a strong relationship between the brain and the ear. Because the cochlea and the vestibule are the transmitting organs within the ear, any dysfunction in this area will have a heavy impact on higher-order brain processes, especially those of language reception and expression.

Phonological awareness

Phonemes are the most basic units of sound. The English language contains 44 phonemes, which are used in combination to form words. Phonological awareness describes the ability to appreciate that language is composed of these phonemes, sequenced in millions of different ways. Most children acquire this recognition over time, and it underpins the development of speech, reading and writing. It is this ability that children rely on to understand rhyming and word patterns, and to distinguish between similar sounds (e.g. ‘pear’ and ‘bear’) and different sound emphases (e.g. ‘accent’ and ‘ascent’).

Without phonological awareness, it is very difficult for children to comprehend word patterns or apply previously learnt auditory rules to subsequent tasks. This is a fundamental part of primary school learning, which is why auditory processing difficulties are most often identified at this time.
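At its simplest, this skill amounts to hearing words as ordered sequences of phonemes and noticing exactly where two sequences differ. The sketch below illustrates the idea; the phoneme transcriptions are rough and purely for illustration, not a formal phonemic lexicon.

```python
def differing_phonemes(word_a, word_b):
    """Compare two words as phoneme sequences and report where they differ.
    Minimal pairs such as 'pear'/'bear' differ in exactly one phoneme."""
    return [(i, a, b) for i, (a, b) in enumerate(zip(word_a, word_b)) if a != b]

# Rough, illustrative transcriptions only.
pear = ["p", "eə", "r"]
bear = ["b", "eə", "r"]
print(differing_phonemes(pear, bear))  # [(0, 'p', 'b')] -- one phoneme apart
```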

The mechanics of auditory processing

Sound waves travel through the ear canal and are amplified in the middle ear. The cochlea and vestibule turn the sound waves into electrical impulses, which travel to the brain through nerve pathways. When the brain receives the impulses, the meaning of the sound – the message – is interpreted. Research has demonstrated that different auditory pathways are concerned with different characteristics of sound, and auditory training is able to target particular pathways depending on an individual’s problems.

When any of the five senses is stimulated, the information is systematically processed. The information is physically received, mentally analysed, organised and stored in short-term or long-term memory, and subsequently retrieved for immediate or later use. The whole procedure is known as information processing, and auditory processing (as a sub-type) is no different. For people who have an auditory processing disorder (APD), the information may not be processed beyond a certain stage, and therefore cannot be made sense of or used.
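The stages named above – reception, analysis, organisation, storage and retrieval – can be pictured as a pipeline in which a breakdown at any stage stops the information from ever becoming usable. The sketch below is only a schematic of that idea (the stage names and failure point are invented for illustration), not a model of the brain.

```python
def process_information(message, stages):
    """Pass a message through successive processing stages; if any stage
    fails, nothing downstream ever receives the message."""
    for name, stage in stages:
        message = stage(message)
        if message is None:
            return f"processing broke down at the '{name}' stage"
    return f"usable result: {message}"

# Illustrative stages only. Here the 'analyse' stage fails, so the message
# is never stored or retrieved -- loosely mirroring how information can be
# lost part-way through processing in APD.
stages = [
    ("receive",  lambda m: m),
    ("analyse",  lambda m: None),   # breakdown point
    ("store",    lambda m: m),
    ("retrieve", lambda m: m),
]
print(process_information("teacher's instruction", stages))
```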

Sensory integration

Sensory integration (SI) is a term that refers to the specific point of information processing when new information is added to the existing knowledge stored in the brain. SI is usually an unconscious ability whereby we understand a situation by combining information about what we can see, hear, taste, feel and smell. If the senses are not well integrated, situations are not properly perceived or understood. This can happen when one (or more) of the senses is overly dominant, too weak, or not interpreted effectively.

One kind of sensory integration problem is sensory overload. When there is SI dysfunction, as there is with APD, the child is unable to block out non-essential background information, which results in sensory overload. As a result, the required message is not fully processed in time to make way for the next piece of information. This process is fairly automatic for most children, but those with APD often become confused or fall behind when auditory information is given rapidly. Unfortunately, rapid auditory delivery is an unavoidable reality of school-based learning, which makes it vital that such a problem is picked up as early as possible.