Neurophysiological research

Researchers: Christina Krause, Mikko Sams, Toni Auranen, Alex and Andrew Fingelkurts, Vasily Klucharev, Veera Meriläinen, Riikka Möttönen, Ville Ojanen, Jukka Saari, Kaisa Tiippana

The neurophysiological studies conducted in our laboratory focus mainly on the neural processing of heard and seen speech. One of our aims is to find out in which brain areas, and at which stage of information processing, the acoustic and visual features of speech are integrated. Experiments have been conducted using the 306-channel neuromagnetometers in the Low Temperature Laboratory of the Helsinki University of Technology and in the BioMag Laboratory of Helsinki University Hospital. Typically, we examine evoked responses as well as brain oscillatory responses to various types of audiovisual stimuli, utilizing a variety of experimental paradigms. We have recently studied how the human brain processes changes in audiovisual and visual speech. We have shown that the change detection mechanisms in the human auditory cortex can distinguish between visual speech inputs, not only between acoustic inputs. Furthermore, a visual change in audiovisual stimulation elicits earlier activation in the auditory cortex than a change in purely visual stimulation. We have also found that a stimulus that deviates both acoustically and visually elicits a beta rhythm response (20 Hz rebound), whereas an audiovisual stimulus that deviates only visually does not. This finding suggests that the motor cortices may be involved in the processing of audiovisual speech.
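The evoked responses mentioned above are obtained by averaging many stimulus-locked recording epochs, so that activity not phase-locked to the stimulus cancels out. A minimal sketch of this idea on synthetic single-channel data (the sampling rate, amplitudes, and latencies below are illustrative assumptions, not the laboratory's actual recording parameters):

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 600                             # sampling rate in Hz (illustrative)
t = np.arange(-0.1, 0.4, 1 / fs)    # epoch: 100 ms pre- to 400 ms post-stimulus
n_epochs = 100

# Simulated single-trial data: an evoked deflection peaking ~100 ms after
# stimulus onset, buried in trial-by-trial noise.
evoked_true = 50e-15 * np.exp(-((t - 0.1) ** 2) / (2 * 0.02 ** 2))
epochs = evoked_true + 100e-15 * rng.standard_normal((n_epochs, t.size))

# Baseline correction: subtract each epoch's mean pre-stimulus value.
baseline = epochs[:, t < 0].mean(axis=1, keepdims=True)
epochs -= baseline

# The evoked response is the across-trial average; averaging N epochs
# reduces the noise standard deviation by a factor of sqrt(N).
evoked = epochs.mean(axis=0)

peak_latency = t[np.argmax(evoked)]
print(f"peak latency: {peak_latency * 1000:.0f} ms")
```

With 100 epochs the residual noise in the average is an order of magnitude smaller than in a single trial, which is why the peak latency of the averaged deflection can be read off reliably.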


Figure 40: Time-frequency representations (TFRs) over the parietal area from a memory task in four modalities. The upper row shows the TFRs obtained during memory encoding (four consecutive items to be held in memory) and the lower row displays the TFRs during memory retrieval (one item is compared with the four previous items). Warm colors denote a relative power increase with respect to a pre-stimulus baseline, whereas cold colors denote a power decrease. Encoding and retrieval of vowels presented in different modalities evoke clearly different TFRs.

Recently, brain oscillatory responses during audiovisual information processing were also examined during a multimodal memory task. In this experiment, brain oscillatory responses were assessed during encoding and retrieval of acoustically, visually, and audiovisually presented vowels. The preliminary findings suggest differences in theta-, alpha-, and beta-band oscillatory responses between the encoding and retrieval of material presented in the visual, auditory, and audiovisual stimulus modalities. The modality in which information is presented seems to alter brain oscillatory activity differently depending on the cognitive task applied. We aim to distinguish stimulus-related from task-related brain oscillatory responses.
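A relative-power TFR of the kind shown in Figure 40 can be sketched with a standard spectrogram normalized to the pre-stimulus baseline. The following is an illustrative example on synthetic data using SciPy's `signal.spectrogram`; the simulated alpha-band suppression merely stands in for the kind of event-related power change such figures display, and all parameters are assumptions rather than the laboratory's actual analysis settings:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
fs = 250                            # sampling rate in Hz (illustrative)
t = np.arange(-0.5, 1.5, 1 / fs)   # 0.5 s pre-stimulus, 1.5 s post-stimulus

# Simulated channel: ongoing 10 Hz (alpha) activity plus noise, with the
# alpha rhythm suppressed 0.2-0.8 s after stimulus onset.
alpha = np.sin(2 * np.pi * 10 * t)
alpha[(t > 0.2) & (t < 0.8)] *= 0.2
x = alpha + 0.5 * rng.standard_normal(t.size)

# Sliding-window power estimate (time-frequency decomposition).
f, tau, Sxx = signal.spectrogram(x, fs=fs, nperseg=64, noverlap=56)
tau = tau + t[0]                    # align spectrogram times to stimulus onset

# Relative power: divide by the mean pre-stimulus power at each frequency,
# expressed in dB. Positive values = increase ("warm"), negative = decrease.
baseline = Sxx[:, tau < 0].mean(axis=1, keepdims=True)
tfr_rel = 10 * np.log10(Sxx / baseline)

# Mean alpha-band change in the suppression window should be below baseline.
alpha_band = (f >= 7) & (f <= 13)
window = (tau > 0.2) & (tau < 0.8)
alpha_change = tfr_rel[alpha_band][:, window].mean()
print(f"alpha-band change: {alpha_change:.1f} dB")
```

Normalizing each frequency to its own pre-stimulus baseline is what lets stimulus-related power changes be read off directly, independently of the strong 1/f shape of the raw power spectrum.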