Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Kaisa Tiippana is active.

Publication


Featured research published by Kaisa Tiippana.


Cognitive Brain Research | 2002

Processing of changes in visual speech in the human auditory cortex

Riikka Möttönen; Christina M. Krause; Kaisa Tiippana; Mikko Sams

Seeing a talker's articulatory gestures may affect the observer's auditory speech percept. Observing congruent articulatory gestures may enhance the recognition of speech sounds [J. Acoust. Soc. Am. 26 (1954) 212], whereas observing incongruent gestures may change the auditory percept phonetically, as occurs in the McGurk effect [Nature 264 (1976) 746]. For example, simultaneous acoustic /ba/ and visual /ga/ are usually heard as /da/. We studied cortical processing of occasional changes in audiovisual and visual speech stimuli with magnetoencephalography. In the audiovisual experiment, congruent (acoustic /iti/, visual /iti/) and incongruent (acoustic /ipi/, visual /iti/) audiovisual stimuli, which were both perceived as /iti/, were presented among congruent /ipi/ (acoustic /ipi/, visual /ipi/) stimuli. In the visual experiment, only the visual components of these stimuli were presented. A visual change in both the audiovisual and visual experiments activated supratemporal auditory cortices bilaterally. The auditory cortex activation to a visual change occurred later in the visual than in the audiovisual experiment, suggesting that interaction between modalities accelerates the detection of visual change in speech.


European Journal of Cognitive Psychology | 2004

Visual attention modulates audiovisual speech perception

Kaisa Tiippana; Tobias S. Andersen; Mikko Sams

Speech perception is audiovisual, as demonstrated by the McGurk effect in which discrepant visual speech alters the auditory speech percept. We studied the role of visual attention in audiovisual speech perception by measuring the McGurk effect in two conditions. In the baseline condition, attention was focused on the talking face. In the distracted attention condition, subjects ignored the face and attended to a visual distractor, which was a leaf moving across the face. The McGurk effect was weaker in the latter condition, indicating that visual attention modulated audiovisual speech perception. This modulation may occur at an early, unisensory processing stage, or it may be due to changes at the stage where auditory and visual information is integrated. We investigated this issue by conventional statistical testing, and by fitting the Fuzzy Logical Model of Perception (Massaro, 1998) to the results. The two methods suggested different interpretations, revealing a paradox in the current methods of analysis.
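The Fuzzy Logical Model of Perception fitted here combines the support each modality lends to a response category by multiplying the two supports and normalizing. A minimal sketch of that combination rule, with purely illustrative support values (not taken from the study):

```python
import numpy as np

def flmp_response_probs(auditory, visual):
    """Fuzzy Logical Model of Perception (Massaro, 1998), minimal sketch.

    auditory[k], visual[k]: degree of support (0..1) each modality gives
    to response category k. Response probability is the normalized
    product of the unimodal supports.
    """
    a = np.asarray(auditory, dtype=float)
    v = np.asarray(visual, dtype=float)
    support = a * v
    return support / support.sum()

# Hypothetical supports for categories /ba/, /da/, /ga/:
# the acoustic signal favors /ba/, the visual /ga/ face favors /da/ and /ga/.
audio = [0.70, 0.20, 0.10]
video = [0.05, 0.45, 0.50]
probs = flmp_response_probs(audio, video)
# The fused /da/ category receives the highest probability,
# a McGurk-like outcome from two conflicting unimodal inputs.
```

Attentional effects can then be probed by asking whether distraction changes the unimodal support values or the integration rule itself, which is the ambiguity the abstract describes.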


Speech Communication | 2009

The role of visual spatial attention in audiovisual speech perception

Tobias S. Andersen; Kaisa Tiippana; Jari Laarni; Ilpo Kojo; Mikko Sams

Auditory and visual information is integrated when perceiving speech, as evidenced by the McGurk effect in which viewing an incongruent talking face categorically alters auditory speech perception. Audiovisual integration in speech perception has long been considered automatic and pre-attentive but recent reports have challenged this view. Here we study the effect of visual spatial attention on the McGurk effect. By presenting a movie of two faces symmetrically displaced to each side of a central fixation point and dubbed with a single auditory speech track, we were able to discern the influences from each of the faces and from the voice on the auditory speech percept. We found that directing visual spatial attention towards a face increased the influence of that face on auditory perception. However, the influence of the voice on auditory perception did not change suggesting that audiovisual integration did not change. Visual spatial attention was also able to select between the faces when lip reading. This suggests that visual spatial attention acts at the level of visual speech perception prior to audiovisual integration and that the effect propagates through audiovisual integration to influence auditory perception.


Neuroscience Letters | 2005

Maximum Likelihood Integration of rapid flashes and beeps

Tobias S. Andersen; Kaisa Tiippana; Mikko Sams

Maximum likelihood models of multisensory integration are theoretically attractive because the goals and assumptions of sensory information processing are explicitly stated in such optimal models. When subjects perceive stimuli categorically, as opposed to on a continuous scale, Maximum Likelihood Integration (MLI) can occur before or after categorization, i.e. early or late. We introduce early MLI and apply it to the audiovisual perception of rapid beeps and flashes. We compare it to late MLI and show that early MLI is a better-fitting and more parsimonious model. We also show that early MLI better accounts for the effects of information reliability, modality appropriateness and intermodal attention, which affect multisensory perception.
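The MLI framework rests on the standard reliability-weighted combination rule for independent Gaussian cues: each modality's estimate is weighted by its inverse variance, and the combined estimate is never less reliable than either cue alone. A minimal sketch of that rule, with illustrative variances that are not taken from the paper:

```python
def ml_integrate(x_a, var_a, x_v, var_v):
    """Maximum-likelihood combination of two cues with Gaussian noise.

    x_a, x_v: unimodal estimates (e.g. perceived number of beeps/flashes).
    var_a, var_v: noise variances; lower variance = higher reliability.
    Returns the reliability-weighted estimate and its (reduced) variance.
    """
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_v)
    w_v = 1.0 - w_a
    x_combined = w_a * x_a + w_v * x_v
    var_combined = 1.0 / (1.0 / var_a + 1.0 / var_v)
    return x_combined, var_combined

# Hypothetical example: audition reports 2 beeps with low variance,
# vision reports 1 flash with higher variance. The fused percept is
# pulled toward the more reliable auditory cue.
x, var = ml_integrate(2.0, 0.25, 1.0, 1.0)  # x = 1.8, var = 0.2
```

Early versus late MLI differ in whether this combination happens on the continuous internal estimates (before categorization) or on the categorized responses; the sketch above shows only the shared combination rule.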


Vision Research | 1999

Spatial-frequency bandwidth of perceived contrast

Kaisa Tiippana; Risto Näsänen

The aim of this study was to investigate the spatial-frequency bandwidth of perceived suprathreshold contrast. It has been shown that for grating stimuli contrast detection thresholds depend on spatial frequency, grating area and the number of orientation components. However, suprathreshold contrast perception exhibits contrast constancy, i.e. suprathreshold contrast matches are independent of these stimulus parameters. To study whether contrast constancy applies to spatial-frequency bandwidth, contrast matching was performed and detection thresholds were measured for spatial noise stimuli at various bandwidths centred at 2 c/deg. At high contrast levels, contrast matches were nearly independent of stimulus spatial-frequency bandwidth up to about 6 octaves, even though detection thresholds increased with bandwidth. Thus, a broad band of spatial frequencies contributed to perceived suprathreshold contrast. The requisites for this are contrast constancy with respect to spatial frequency, and integration of contrast information across different spatial frequencies so that the effective bandwidth of the system is broad.


Vision Research | 2000

Contrast matching across spatial frequencies for isoluminant chromatic gratings

Kaisa Tiippana; Jyrki Rovamo; Risto Näsänen; David Whitaker; Pia Mäkelä

Contrast matching was performed with isoluminant red-green and s-cone gratings at spatial frequencies ranging from 0.5 to 8 c/deg. Contrast threshold curves were low-pass in shape, in agreement with previous findings. Contrast matching functions resembled threshold curves at low contrast levels, but became flat and independent of spatial frequency at high contrasts. Thus, isoluminant chromatic gratings exhibited contrast constancy at suprathreshold contrast levels in a similar manner as has been demonstrated for achromatic gratings.


Cognitive Brain Research | 2004

Factors influencing audiovisual fission and fusion illusions

Tobias S. Andersen; Kaisa Tiippana; Mikko Sams


Neuropsychologia | 2008

Impaired recognition of facial emotions from low-spatial frequencies in Asperger syndrome

Jari Kätsyri; Satu Saalasti; Kaisa Tiippana; Lennart von Wendt; Mikko Sams


Neuroscience Letters | 2003

Integration of heard and seen speech: a factor in learning disabilities in children

Erin Hayes; Kaisa Tiippana; Trent Nicol; Mikko Sams; Nina Kraus


AVSP | 2001

Visual attention influences audiovisual speech perception.

Kaisa Tiippana; Mikko Sams; Tobias S. Andersen

Collaboration


Dive into Kaisa Tiippana's collaborations.

Top Co-Authors

Tobias S. Andersen (Helsinki University of Technology)

Mitesh Bhudia (Anglia Ruskin University)

Christina M. Krause (Helsinki University of Technology)

Lennart von Wendt (Helsinki University Central Hospital)