Hearing Loss Disrupts Speech Coordination

by time news usa

Summary: A recent study reveals that hearing plays a vital role in coordinating speech movements. Researchers found that when individuals briefly couldn’t hear their own speech, their ability to control their jaw and tongue movements declined.

This discovery is particularly important for understanding speech production in people with hearing loss, including those using cochlear implants. The findings may lead to new therapeutic strategies focused on oral-motor training for individuals with hearing impairments.

Key Facts:

  • Hearing loss impairs real-time coordination of speech movements.
  • People may rely more on oral-motor feedback when auditory feedback is reduced.
  • New therapies could improve speech for those with hearing loss or cochlear implants.

A McGill University study has shown that hearing plays a crucial role in how people coordinate and control speech movements in real time.

Published in The Journal of the Acoustical Society of America, the research shows that when people cannot hear their own speech, even briefly, their ability to move their jaw and tongue in a coordinated manner is impaired.


“People rely on immediate auditory feedback to coordinate and control the movements of their vocal tract in service to speech production,” said Matthew Masapollo, lead author of the paper, who conducted the study while working as a Research Associate in McGill’s Motor Neuroscience Laboratory.

The team used electromagnetic articulography (EMA) to track jaw and tongue-tip movements during speaking in people with normal hearing under two conditions: when they could hear their speech and when it was masked with multi-talker noise.

In the latter scenario, where participants briefly couldn’t hear themselves, speech motor performance declined.

This finding has significant implications for understanding speech production in people with hearing loss, especially those using cochlear implants.

“Some aspects of speech production remain impaired, even years after implantation, undoubtedly because the auditory signals available through CIs are degraded,” said Masapollo.

Understanding how degraded auditory input affects speech production helps ensure that cochlear implants are effective and guides efforts to help children with severe hearing loss learn to speak, the researchers noted.

Masapollo, in collaboration with Susan Nittrouer and McGill professors David J. Ostry and Lucie Ménard, is now investigating how the reduced auditory access provided by cochlear implants affects speech production in implant users.

The preliminary findings suggest that people with hearing loss might rely more on how their mouth and tongue feel, rather than auditory feedback, to control speech movements.

If these preliminary findings are confirmed, clinical researchers could build on them by developing new therapeutic interventions focused on oral-motor training for children and adults with hearing loss.

About this auditory neuroscience research news

Original Research: Closed access.
“Immediate auditory feedback regulates inter-articulator speech coordination in service to phonetic structure” by Matthew Masapollo et al. in The Journal of the Acoustical Society of America.


Abstract

Immediate auditory feedback regulates inter-articulator speech coordination in service to phonetic structure

Research has shown that talkers reliably coordinate the timing of articulator movements across variation in production rate and syllable stress, and that this precision of inter-articulator timing instantiates phonetic structure in the resulting acoustic signal.

We here tested the hypothesis that immediate auditory feedback helps regulate that consistent articulatory timing control.

Talkers with normal hearing recorded 480 /tV#Cat/ utterances using electromagnetic articulography, with alternative V (/ɑ/-/ɛ/) and C (/t/-/d/), across variation in production rate (fast-normal) and stress (first syllable stressed-unstressed). Utterances were split between two listening conditions: unmasked and masked.

To quantify the effect of immediate auditory feedback on the coordination between the jaw and tongue-tip, the timing of tongue-tip raising onset for C, relative to the jaw opening-closing cycle for V, was obtained in each listening condition.

Across both listening conditions, any manipulation that shortened the jaw opening-closing cycle reduced the latency of tongue-tip movement onset, relative to the onset of jaw opening. Moreover, tongue-tip latencies were strongly affiliated with utterance type.

During auditory masking, however, tongue-tip latencies were less strongly affiliated with utterance type, demonstrating that talkers use afferent auditory signals in real-time to regulate the precision of inter-articulator timing in service to phonetic structure.
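
To make the abstract's timing measure concrete, the sketch below (not the authors' analysis code) shows one simplified way to estimate the latency of tongue-tip raising onset relative to jaw-opening onset from EMA-style position traces; the sampling rate, the velocity-threshold onset criterion, and the synthetic signals are all assumptions made for illustration.

```python
import numpy as np

# Illustrative sketch only: given hypothetical EMA position traces sampled at a
# fixed rate, estimate the latency of tongue-tip raising onset relative to the
# onset of jaw opening for one utterance. Onsets are detected here with a simple
# velocity-threshold rule; the study's actual onset criteria may differ.

FS = 400.0  # assumed sampling rate in Hz

def movement_onset(position, threshold_frac=0.2):
    """Return the sample index where speed first exceeds a fraction of its
    peak -- a common, simplified velocity-threshold onset criterion."""
    velocity = np.gradient(position) * FS
    speed = np.abs(velocity)
    threshold = threshold_frac * speed.max()
    return int(np.argmax(speed > threshold))

def tongue_tip_latency(jaw_y, tongue_tip_y):
    """Latency (s) from jaw-opening onset to tongue-tip-raising onset, plus the
    same latency normalized by the duration of the jaw opening-closing cycle."""
    jaw_onset = movement_onset(jaw_y)
    tt_onset = movement_onset(tongue_tip_y)
    # Approximate the jaw cycle as running from opening onset to the point of
    # maximum reclosure (jaw_y is vertical position; lower values = more open).
    jaw_cycle_end = jaw_onset + int(np.argmax(jaw_y[jaw_onset:]))
    cycle_dur = (jaw_cycle_end - jaw_onset) / FS
    latency = (tt_onset - jaw_onset) / FS
    return latency, latency / cycle_dur if cycle_dur > 0 else np.nan

# Toy example with synthetic traces standing in for real EMA recordings.
t = np.linspace(0, 0.5, int(0.5 * FS))
jaw_y = -np.sin(2 * np.pi * 2 * t)                        # one opening-closing cycle
tongue_tip_y = np.where(t > 0.15, (t - 0.15) * 4, 0.0)    # delayed tongue-tip raising
print(tongue_tip_latency(jaw_y, tongue_tip_y))
```

In the study's terms, a latency measure of this kind that remains tightly tied to utterance type under unmasked listening, but less so under masking, would correspond to the weakened regulation of inter-articulator timing the authors report.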
