Publications


Featured research published by Riikka Möttönen.


The Journal of Neuroscience | 2009

Motor Representations of Articulators Contribute to Categorical Perception of Speech Sounds

Riikka Möttönen; Kate E. Watkins

Listening to speech modulates activity in human motor cortex. It is unclear, however, whether the motor cortex has an essential role in speech perception. Here, we aimed to determine whether the motor representations of articulators contribute to categorical perception of speech sounds. Categorization of continuously variable acoustic signals into discrete phonemes is a fundamental feature of speech communication. We used repetitive transcranial magnetic stimulation (rTMS) to temporarily disrupt the lip representation in the left primary motor cortex. This disruption impaired categorical perception of artificial acoustic continua ranging between two speech sounds that differed in place of articulation, that is, the vocal tract is opened and closed rapidly with either the lips or the tip of the tongue (/ba/–/da/ and /pa/–/ta/). In contrast, it did not impair categorical perception of continua ranging between speech sounds that do not involve the lips in their articulation (/ka/–/ga/ and /da/–/ga/). Furthermore, an rTMS-induced disruption of the hand representation had no effect on categorical perception of either of the tested continua (/ba/–/da/ and /ka/–/ga/). These findings indicate that motor circuits controlling production of speech sounds also contribute to their perception. Mapping acoustically highly variable speech sounds onto less variable motor representations may facilitate their phonemic categorization and be important for robust speech perception.
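
To make the notion of categorical perception concrete: it is typically quantified by fitting a sigmoid (psychometric) function to identification responses along the acoustic continuum, where the inflection point gives the category boundary and the slope gives the sharpness of categorization. The Python sketch below illustrates this; it is not the authors' analysis pipeline, and the continuum steps and response proportions are invented for illustration.

```python
# Hypothetical sketch: quantifying categorical perception along a /ba/-/da/
# continuum by fitting a logistic psychometric function. Not the authors'
# pipeline; the step count and identification data below are invented.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, boundary, slope):
    """Proportion of /da/ responses as a function of continuum step."""
    return 1.0 / (1.0 + np.exp(-slope * (x - boundary)))

steps = np.arange(1, 11)                      # 10-step acoustic continuum
p_da = np.array([0.02, 0.03, 0.05, 0.10,      # hypothetical identification
                 0.30, 0.70, 0.90, 0.95,      # proportions per step
                 0.97, 0.99])

(boundary, slope), _ = curve_fit(logistic, steps, p_da, p0=[5.0, 1.0])
print(f"category boundary at step {boundary:.2f}, slope {slope:.2f}")
# A shallower fitted slope after rTMS of the lip representation would
# correspond to the impaired categorical perception described above.
```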


Cognitive Brain Research | 2002

Processing of changes in visual speech in the human auditory cortex

Riikka Möttönen; Christina M. Krause; Kaisa Tiippana; Mikko Sams

Seeing a talker's articulatory gestures may affect the observer's auditory speech percept. Observing congruent articulatory gestures may enhance the recognition of speech sounds [J. Acoust. Soc. Am. 26 (1954) 212], whereas observing incongruent gestures may change the auditory percept phonetically, as occurs in the McGurk effect [Nature 264 (1976) 746]. For example, simultaneous acoustic /ba/ and visual /ga/ are usually heard as /da/. We studied cortical processing of occasional changes in audiovisual and visual speech stimuli with magnetoencephalography. In the audiovisual experiment, congruent (acoustic /iti/, visual /iti/) and incongruent (acoustic /ipi/, visual /iti/) audiovisual stimuli, which were both perceived as /iti/, were presented among congruent /ipi/ (acoustic /ipi/, visual /ipi/) stimuli. In the visual experiment, only the visual components of these stimuli were presented. A visual change in both the audiovisual and visual experiments activated the supratemporal auditory cortices bilaterally. The auditory cortex activation to a visual change occurred later in the visual than in the audiovisual experiment, suggesting that interaction between modalities accelerates the detection of visual change in speech.


Neuroscience Letters | 2004

Time course of multisensory interactions during audiovisual speech perception in humans: A magnetoencephalographic study

Riikka Möttönen; Martin Schürmann; Mikko Sams

During social interaction speech is perceived simultaneously by audition and vision. We studied interactions in the processing of auditory (A) and visual (V) speech signals in the human brain by comparing neuromagnetic responses to phonetically congruent audiovisual (AV) syllables with the arithmetic sum of responses to A and V syllables. Differences between AV and A+V responses were found bilaterally in the auditory cortices 150-200 ms and in the right superior temporal sulcus (STS) 250-600 ms after stimulus onset, showing that both sensory-specific and multisensory regions of the human temporal cortices are involved in AV speech processing. Importantly, our results suggest that AV interaction in the auditory cortex precedes that in the multisensory STS region.
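
The comparison described above is an additive-model test: the evoked response to the audiovisual stimulus is contrasted with the arithmetic sum of the unisensory responses, so any nonzero residual marks multisensory interaction. A minimal sketch of that arithmetic follows; the arrays, channel count, and sampling rate are hypothetical stand-ins for averaged evoked responses, not the study's data.

```python
# Sketch of the AV vs. A+V additive-model comparison. Placeholder arrays
# (channels x time) stand in for averaged neuromagnetic evoked responses.
import numpy as np

fs = 1000                                   # sampling rate in Hz (assumed)
t = np.arange(0, 0.8, 1 / fs)               # 0-800 ms after stimulus onset
rng = np.random.default_rng(0)
av = rng.standard_normal((306, t.size))     # placeholder AV evoked response
a = rng.standard_normal((306, t.size))      # placeholder A evoked response
v = rng.standard_normal((306, t.size))      # placeholder V evoked response

interaction = av - (a + v)                  # nonzero where A and V interact

def window_mean(x, t, t0, t1):
    """Mean interaction amplitude per channel within a latency window (s)."""
    mask = (t >= t0) & (t < t1)
    return x[:, mask].mean(axis=1)

early = window_mean(interaction, t, 0.150, 0.200)   # auditory-cortex window
late = window_mean(interaction, t, 0.250, 0.600)    # right-STS window
```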


Brain and Language | 2003

Cortical operational synchrony during audio-visual speech integration

Andrew A. Fingelkurts; Alexander A. Fingelkurts; Christina M. Krause; Riikka Möttönen; Mikko Sams

Information from different sensory modalities is processed in different cortical regions. However, our daily perception is based on the overall impression resulting from the integration of information from multiple sensory modalities. At present it is not known how the human brain integrates information from different modalities into a unified percept. Using a robust phenomenon known as the McGurk effect, the present study showed that audio-visual synthesis takes place within distributed and dynamic cortical networks with emergent properties. Various cortical sites within these networks interact with each other by means of so-called operational synchrony (Kaplan, Fingelkurts, Fingelkurts, & Darkhovsky, 1997). The temporal synchronization of cortical operations processing unimodal stimuli at different cortical sites reveals the importance of the temporal features of auditory and visual stimuli for audio-visual speech integration.


Human Brain Mapping | 2006

Attention to visual speech gestures enhances hemodynamic activity in the left planum temporale

Johanna Pekkola; Ville Ojanen; Taina Autti; Iiro P. Jääskeläinen; Riikka Möttönen; Mikko Sams

Observing a speaker's articulatory gestures can contribute considerably to auditory speech perception. At the level of neural events, seen articulatory gestures can modify auditory cortex responses to speech sounds and modulate auditory cortex activity even in the absence of heard speech. However, possible effects of attention on this modulation have remained unclear. To investigate the effect of attention on visual speech-induced auditory cortex activity, we scanned 10 healthy volunteers with functional magnetic resonance imaging (fMRI) at 3 T during simultaneous presentation of visual speech gestures and moving geometrical forms, with the instruction to either focus on or ignore the seen articulations. Secondary auditory cortex areas in the bilateral posterior superior temporal gyrus and planum temporale were active both when the articulatory gestures were ignored and when they were attended to. However, attention to visual speech gestures enhanced activity in the left planum temporale compared to the situation when the subjects saw identical stimuli but engaged in a nonspeech motion discrimination task. These findings suggest that attention to visually perceived speech gestures modulates auditory cortex function and that this modulation takes place at a hierarchically relatively early processing level.


Cerebral Cortex | 2013

Auditory-Motor Processing of Speech Sounds

Riikka Möttönen; Rebekah Dutton; Kate E. Watkins

The motor regions that control movements of the articulators activate during listening to speech and contribute to performance in demanding speech recognition and discrimination tasks. Whether the articulatory motor cortex modulates auditory processing of speech sounds is unknown. Here, we aimed to determine whether the articulatory motor cortex affects the auditory mechanisms underlying discrimination of speech sounds in the absence of demanding speech tasks. Using electroencephalography, we recorded responses to changes in sound sequences, while participants watched a silent video. We also disrupted the lip or the hand representation in left motor cortex using transcranial magnetic stimulation. Disruption of the lip representation suppressed responses to changes in speech sounds, but not piano tones. In contrast, disruption of the hand representation had no effect on responses to changes in speech sounds. These findings show that disruptions within, but not outside, the articulatory motor cortex impair automatic auditory discrimination of speech sounds. The findings provide evidence for the importance of auditory-motor processes in efficient neural analysis of speech sounds.


Journal of Cognitive Neuroscience | 2008

Visual processing affects the neural basis of auditory discrimination

Daniel S. Kislyuk; Riikka Möttönen; Mikko Sams

The interaction between auditory and visual speech streams is a seamless and surprisingly effective process. An intriguing example is the McGurk effect: the acoustic syllable /ba/ presented simultaneously with a mouth articulating /ga/ is typically heard as /da/ [McGurk, H., & MacDonald, J. Hearing lips and seeing voices. Nature, 264, 746-748, 1976]. Previous studies have demonstrated the interaction of auditory and visual streams at the auditory cortex level, but the importance of these interactions for the qualitative perception change remained unclear, because the change could result from interactions at higher processing levels as well. In our electroencephalogram experiment, we combined the McGurk effect with mismatch negativity (MMN), a response that is elicited in the auditory cortex at a latency of 100-250 msec by any above-threshold change in a sequence of repetitive sounds. An oddball sequence of acoustic stimuli consisting of frequent /va/ syllables (standards) and infrequent /ba/ syllables (deviants) was presented to 11 participants. Deviant stimuli in the unisensory acoustic stimulus sequence elicited a typical MMN, reflecting discrimination of acoustic features in the auditory cortex. When the acoustic stimuli were dubbed onto a video of a mouth constantly articulating /va/, the deviant acoustic /ba/ was heard as /va/ due to the McGurk effect and was indistinguishable from the standards. Importantly, such deviants did not elicit an MMN, indicating that the auditory cortex failed to discriminate between the acoustic stimuli. Our findings show that the visual stream can qualitatively change the auditory percept at the auditory cortex level, profoundly influencing the auditory cortex mechanisms underlying early sound discrimination.
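
Computationally, the MMN logic above reduces to a deviant-minus-standard difference wave averaged over the 100-250 msec window. The sketch below illustrates that arithmetic; the epoch arrays, trial counts, and sampling rate are hypothetical, not the study's recordings.

```python
# Sketch of the mismatch negativity (MMN) computation: average the epochs
# for each stimulus type, subtract, and measure the 100-250 ms window.
# Epoch arrays (trials x time) are invented placeholders.
import numpy as np

fs = 500                                    # sampling rate in Hz (assumed)
t = np.arange(-0.1, 0.5, 1 / fs)            # epoch from -100 to 500 ms
rng = np.random.default_rng(1)
standard_epochs = rng.standard_normal((400, t.size))   # frequent /va/
deviant_epochs = rng.standard_normal((60, t.size))     # infrequent /ba/

mmn = deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)

window = (t >= 0.100) & (t < 0.250)
mmn_amplitude = mmn[window].mean()
# In the McGurk condition, a deviant heard as /va/ should yield an MMN
# amplitude near zero, as reported above.
print(f"mean MMN amplitude in 100-250 ms: {mmn_amplitude:.3f}")
```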


The Journal of Neuroscience | 2010

Lipreading and Covert Speech Production Similarly Modulate Human Auditory-Cortex Responses to Pure Tones

Jaakko Kauramäki; Iiro P. Jääskeläinen; Riitta Hari; Riikka Möttönen; Josef P. Rauschecker; Mikko Sams

Watching the lips of a speaker enhances speech perception. At the same time, the 100 ms response to speech sounds is suppressed in the observer's auditory cortex. Here, we used whole-scalp 306-channel magnetoencephalography (MEG) to study whether lipreading modulates human auditory processing already at the level of the most elementary sound features, i.e., pure tones. We further examined the temporal dynamics of the suppression to determine whether the effect is driven by top-down influences. Nineteen subjects were presented with 50 ms tones spanning six octaves (125–8000 Hz) (1) during “lipreading,” i.e., when they watched video clips of silent articulations of the Finnish vowels /a/, /i/, /o/, and /y/, and reacted to vowels presented twice in a row; (2) during a visual control task; (3) during a still-face passive control condition; and (4) in a separate experiment with a subset of nine subjects, during covert production of the same vowels. Auditory-cortex 100 ms responses (N100m) were equally suppressed in the lipreading and covert-speech-production tasks compared with the visual control and baseline tasks; the effects involved all frequencies and were most prominent in the left hemisphere. Responses to tones presented at different times with respect to the onset of the visual articulation showed significantly increased N100m suppression immediately after the articulatory gesture. These findings suggest that the lipreading-related suppression in the auditory cortex is caused by top-down influences, possibly by an efference copy from the speech-production system, generated during both own speech and lipreading.
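
The N100m comparison above amounts to extracting the peak amplitude of the 100 ms response in each condition and expressing it relative to a baseline condition. A hedged sketch of that quantification follows; the waveforms, condition names, and sampling rate are placeholders, not the study's MEG source data.

```python
# Hypothetical sketch: quantifying N100m suppression as the reduction of
# peak amplitude relative to the still-face baseline. Evoked waveforms
# below are random placeholders.
import numpy as np

fs = 600                                    # sampling rate in Hz (assumed)
t = np.arange(0, 0.3, 1 / fs)               # 0-300 ms after tone onset
rng = np.random.default_rng(2)

def n100m_peak(evoked, t):
    """Peak amplitude in an 80-150 ms search window around N100m."""
    window = (t >= 0.080) & (t < 0.150)
    return np.abs(evoked[window]).max()

conditions = {name: rng.standard_normal(t.size)        # placeholder evoked
              for name in ("lipreading", "covert_speech",
                           "visual_control", "still_face")}
peaks = {name: n100m_peak(ev, t) for name, ev in conditions.items()}

# Suppression expressed relative to the still-face baseline condition;
# the study reports comparable suppression for lipreading and covert speech.
for name in ("lipreading", "covert_speech", "visual_control"):
    suppression = 1.0 - peaks[name] / peaks["still_face"]
    print(f"{name}: {suppression:+.2f}")
```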


Aphasiology | 2012

Using TMS to study the role of the articulatory motor system in speech perception

Riikka Möttönen; Kate E. Watkins

Background: The ability to communicate using speech is a remarkable skill, which requires precise coordination of articulatory movements and decoding of complex acoustic signals. According to the traditional view, speech production and perception rely on motor and auditory brain areas, respectively. However, there is growing evidence that auditory-motor circuits support both speech production and perception. Aims: In this article we provide a review of how transcranial magnetic stimulation (TMS) has been used to investigate the excitability of the motor system during listening to speech and the contribution of the motor system to performance in various speech perception tasks. We also discuss how TMS can be used in combination with brain-imaging techniques to study interactions between motor and auditory systems during speech perception. Main contribution: TMS has proven to be a powerful tool to investigate the role of the articulatory motor system in speech perception. Conclusions: TMS studies have provided support for the view that the motor structures that control the movements of the articulators contribute not only to speech production but also to speech perception.


The Journal of Neuroscience | 2014

Attention fine-tunes auditory-motor processing of speech sounds

Riikka Möttönen; G.M. van de Ven; Kate E. Watkins

The earliest stages of cortical processing of speech sounds take place in the auditory cortex. Transcranial magnetic stimulation (TMS) studies have provided evidence that the human articulatory motor cortex also contributes to speech processing. For example, stimulation of the motor lip representation specifically influences discrimination of lip-articulated speech sounds. However, the timing of the neural mechanisms underlying these articulator-specific motor contributions to speech processing is unknown. Furthermore, it is unclear whether they depend on attention. Here, we used magnetoencephalography and TMS to investigate the effect of attention on the specificity and timing of interactions between the auditory and motor cortex during processing of speech sounds. We found that TMS-induced disruption of the motor lip representation specifically modulated the early auditory-cortex responses to lip-articulated speech sounds when they were attended. These articulator-specific modulations were left-lateralized and remarkably early, occurring 60–100 ms after sound onset. When speech sounds were ignored, the effect of this motor disruption on auditory-cortex responses was nonspecific and bilateral, and it started later, 170 ms after sound onset. The findings indicate that the articulatory motor cortex can contribute to auditory processing of speech sounds even in the absence of behavioral tasks and when the sounds are not in the focus of attention. Importantly, the findings also show that attention can selectively facilitate the interaction of the auditory cortex with specific articulator representations during speech processing.
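
The timing result above rests on estimating when the TMS-induced modulation of the auditory response begins in each attention condition. One simple way to sketch that onset estimation: subtract the control evoked response from the TMS-disrupted one and find the first latency where the difference exceeds a threshold. The arrays, threshold, and condition labels below are invented for illustration.

```python
# Hypothetical sketch of estimating the onset latency of a TMS-induced
# modulation: |TMS - control| difference thresholded over time.
import numpy as np

fs = 1000                                   # sampling rate in Hz (assumed)
t = np.arange(0, 0.4, 1 / fs)               # 0-400 ms after sound onset
rng = np.random.default_rng(3)

def modulation_onset(with_tms, without_tms, t, threshold):
    """First latency at which |TMS - control| exceeds the threshold."""
    diff = np.abs(with_tms - without_tms)
    above = np.flatnonzero(diff > threshold)
    return t[above[0]] if above.size else None

for condition in ("attended", "ignored"):   # placeholder evoked responses
    tms = rng.standard_normal(t.size)
    control = rng.standard_normal(t.size)
    onset = modulation_onset(tms, control, t, threshold=2.0)
    print(condition, onset)
    # The study reports onsets near 60-100 ms when sounds were attended
    # and around 170 ms when they were ignored.
```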

Collaboration


Dive into Riikka Möttönen's collaborations.

Top Co-Authors

Iiro P. Jääskeläinen
Helsinki University of Technology

Kaisa Tiippana
Helsinki University of Technology

Johanna Pekkola
Helsinki University of Technology

Taina Autti
Helsinki University of Technology

Ville Ojanen
Helsinki University of Technology

Christina M. Krause
Helsinki University of Technology