The cortical organization of audio-visual sentence comprehension: an fMRI study at 4 Tesla.
Capek CM, Bavelier D, Corina D, Newman AJ, Jezzard P, Neville HJ.
Neuroimaging studies of written and spoken sentence processing report greater left hemisphere than right hemisphere activation. However, most of our everyday experience with language occurs in face-to-face interaction, which is far richer in information. The current study examined the neural organization of audio-visual (AV) sentence processing using functional magnetic resonance imaging (fMRI) at 4 Tesla. Participants viewed the face and upper body of a speaker on a video screen while listening to her produce, in alternating blocks, English sentences and sentences composed of pronounceable non-words. Audio-visual sentence processing was associated with left hemisphere activation in Broca's area, dorsolateral prefrontal cortex, the superior precentral sulcus, anterior and middle portions of the lateral sulcus, middle and superior portions of the temporal sulcus, the supramarginal gyrus, and the angular gyrus. AV sentence processing also elicited activation in the right anterior and middle lateral sulcus. Between-hemisphere analyses revealed a left hemisphere dominant pattern of activation. The findings support the hypothesis that the left hemisphere may be biased to process language independently of the modality through which it is perceived. These results are discussed in the context of previous neuroimaging findings on American Sign Language (ASL).