Auditory and visual word processing studied with fMRI

Hum Brain Mapp. 1999;7(1):15-28. doi: 10.1002/(SICI)1097-0193(1999)7:1<15::AID-HBM2>3.0.CO;2-6.

Abstract

Brain activations associated with semantic processing of visual and auditory words were investigated using functional magnetic resonance imaging (fMRI). For each form of word presentation, subjects performed two tasks: one semantic and one nonsemantic. The semantic task was identical for auditory and visual presentation: single words were presented, and subjects judged whether each word was concrete or abstract. In the nonsemantic task for auditory words, subjects judged whether the word had one syllable or multiple syllables; in the nonsemantic task for visual words, subjects judged whether the word was presented in lower case or upper case. The regions engaged by auditory and visual word semantic processing overlapped considerably. Both semantic tasks activated the left inferior frontal (BA 45), bilateral anterior prefrontal (BA 10, 46), and left premotor (BA 6) regions, as well as the anterior SMA (BA 6, 8). Left posterior temporal (middle temporal and fusiform gyrus) and predominantly right-sided cerebellar activations were observed during the auditory semantic task but did not reach threshold during visual word presentation. When averaged across subjects, the data did not show obligatory activation of left inferior frontal and temporal language areas during the nonsemantic word tasks. Individual subjects differed in activation of the inferior frontal region while performing the same task, despite similar response latencies and accuracy.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Acoustic Stimulation
  • Adult
  • Brain / anatomy & histology
  • Brain / physiology*
  • Brain Mapping / methods*
  • Humans
  • Language*
  • Magnetic Resonance Imaging / methods
  • Photic Stimulation
  • Reading*
  • Speech*