Lip-reading enables the brain to synthesize auditory features of unknown silent speech
Bourguignon, Mathieu; Baart, Martijn; Kapnoula, Efthymia; Molinaro, Nicola
Abstract
Lip-reading is crucial for understanding speech in challenging conditions. But how the brain extracts meaning from silent visual speech is still under debate. Lip-reading in silence activates the auditory cortices, but it is not known whether such activation reflects immediate synthesis of the corresponding auditory stimulus or imagery of unrelated sounds. To disentangle these possibilities, we used magnetoencephalography to evaluate how cortical activity in 28 healthy adult humans (17 females) entrained to the auditory speech envelope and lip movements (mouth opening) when listening to a spoken story without visual input (audio-only), and when seeing a silent video of a speaker articulating another story (video-only). In video-only, auditory cortical activity entrained to the absent auditory signal at frequencies below 1 Hz more than to the seen lip movements. This entrainment process was characterized by an auditory speech-to-brain delay of ∼70 ms in the left hemisphere, compared to ∼20 ms in audio-only. Entrainment to mouth opening was found in the right angular gyrus at below 1 Hz, and in early visual cortices at 1–8 Hz. These findings demonstrate that the brain can use a silent lip-read signal to synthesize a coarse-grained auditory speech representation in early auditory cortices. Our data indicate the following underlying oscillatory mechanism: seeing lip movements first modulates neuronal activity in early visual cortices at frequencies that match articulatory lip movements; the right angular gyrus then extracts slower features of lip movements, mapping them onto the corresponding speech sound features; this information is fed to auditory cortices, most likely facilitating speech parsing.
Date
2020
Files
a19.pdf
Adobe PDF, 1.35 MB
Keywords
attended speech, audiovisual integration, cortex activation, electrophysiological evidence, lip-reading, magnetoencephalography (MEG), oscillations, patterns, perception, responses, silent speech, speech entrainment, tracking, visual speech
Citation
Bourguignon, M., Baart, M., Kapnoula, E., & Molinaro, N. (2020). 'Lip-reading enables the brain to synthesize auditory features of unknown silent speech'. Journal of Neuroscience, vol. 40, no. 5, pp. 1053-1065. https://doi.org/10.1523/JNEUROSCI.1101-19.2019
License
info:eu-repo/semantics/openAccess
