Network


Latest external collaboration at the country level.

Hotspot


Dive into the research topics where Magnus Alm is active.

Publication


Featured research published by Magnus Alm.


Journal of the Acoustical Society of America | 2009

Audio-visual identification of place of articulation and voicing in white and babble noise

Magnus Alm; Dawn M. Behne; Yue Wang; Ragnhild Eg

Research shows that noise and phonetic attributes influence the degree to which auditory and visual modalities are used in audio-visual speech perception (AVSP). Research has, however, mainly focused on white noise and single phonetic attributes, thus neglecting the more common babble noise and possible interactions between phonetic attributes. This study explores whether white and babble noise differentially influence AVSP and whether these differences depend on phonetic attributes. White and babble noise of 0 and -12 dB signal-to-noise ratio were added to congruent and incongruent audio-visual stop consonant-vowel stimuli. The audio (A) and video (V) of incongruent stimuli differed either in place of articulation (POA) or voicing. Responses from 15 young adults show that, compared to white noise, babble resulted in more audio responses for POA stimuli, and fewer for voicing stimuli. Voiced syllables received more audio responses than voiceless syllables. Results can be attributed to discrepancies in the acoustic spectra of both the noise and speech target. Voiced consonants may be more auditorily salient than voiceless consonants which are more spectrally similar to white noise. Visual cues contribute to identification of voicing, but only if the POA is visually salient and auditorily susceptible to the noise type.
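For illustration only, the sketch below shows one common way to mix noise into a speech signal at a target signal-to-noise ratio such as the 0 and -12 dB conditions mentioned in the abstract. It is not the authors' stimulus-preparation code; the function name mix_at_snr and the synthetic signals are assumptions made here.

```python
import numpy as np

def mix_at_snr(speech, noise, snr_db):
    """Scale `noise` so that the speech-to-noise power ratio equals `snr_db`,
    then add it to the speech signal. Both inputs are 1-D float arrays."""
    speech = np.asarray(speech, dtype=float)
    noise = np.asarray(noise, dtype=float)
    # Match lengths: tile the noise if it is shorter, then trim to the speech duration.
    if len(noise) < len(speech):
        noise = np.tile(noise, int(np.ceil(len(speech) / len(noise))))
    noise = noise[: len(speech)]
    # RMS levels of both signals.
    rms_speech = np.sqrt(np.mean(speech ** 2))
    rms_noise = np.sqrt(np.mean(noise ** 2))
    # Gain that brings the noise to the desired SNR: SNR_dB = 20*log10(rms_s / (gain*rms_n)).
    gain = rms_speech / (rms_noise * 10 ** (snr_db / 20))
    return speech + gain * noise

# Example: white noise added at 0 dB and -12 dB SNR to a synthetic "syllable".
rng = np.random.default_rng(0)
syllable = np.sin(2 * np.pi * 220 * np.linspace(0, 0.5, 8000))
white = rng.standard_normal(8000)
for snr in (0, -12):
    mixed = mix_at_snr(syllable, white, snr)
    print(snr, "dB SNR -> mixed RMS:", round(float(np.sqrt(np.mean(mixed ** 2))), 3))
```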


Journal of the Acoustical Society of America | 2013

Audio-visual speech experience with age influences perceived audio-visual asynchrony in speech.

Magnus Alm; Dawn M. Behne

Previous research indicates that perception of audio-visual (AV) synchrony changes in adulthood. Possible explanations for these age differences include a decline in hearing acuity, a decline in cognitive processing speed, and increased experience with AV binding. The current study aims to isolate the effect of AV experience by comparing synchrony judgments from 20 young adults (20 to 30 yrs) and 20 normal-hearing middle-aged adults (50 to 60 yrs), an age range for which a decline of cognitive processing speed is expected to be minimal. When presented with AV stop consonant syllables with asynchronies ranging from 440 ms audio-lead to 440 ms visual-lead, middle-aged adults showed significantly less tolerance for audio-lead than young adults. Middle-aged adults also showed a greater shift in their point of subjective simultaneity than young adults. Natural audio-lead asynchronies are arguably more predictable than natural visual-lead asynchronies, and this predictability may render audio-lead thresholds more prone to experience-related fine-tuning.
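As a rough illustration of the measures discussed here, the sketch below estimates a point of subjective simultaneity (PSS) and a synchrony window from made-up simultaneity-judgment proportions. The data, the response-weighted-mean estimate of the PSS, and the 50% crossing points are all assumptions and are not taken from the study's analysis.

```python
import numpy as np

# Hypothetical simultaneity-judgment data: stimulus-onset asynchronies (ms,
# negative = audio lead) and the proportion of "simultaneous" responses at each.
soas = np.array([-440, -330, -220, -110, 0, 110, 220, 330, 440], dtype=float)
p_simultaneous = np.array([0.05, 0.15, 0.45, 0.80, 0.95, 0.90, 0.70, 0.35, 0.10])

# Point of subjective simultaneity: here taken as the response-weighted mean SOA.
pss = np.sum(soas * p_simultaneous) / np.sum(p_simultaneous)

# Synchrony-window edges: SOAs where the proportion crosses 0.5,
# found by linear interpolation between neighbouring test points.
def crossings(x, y, level=0.5):
    above = y >= level
    idx = np.where(above[:-1] != above[1:])[0]
    out = []
    for i in idx:
        frac = (level - y[i]) / (y[i + 1] - y[i])
        out.append(x[i] + frac * (x[i + 1] - x[i]))
    return out

edges = crossings(soas, p_simultaneous)
print(f"PSS = {pss:.1f} ms, window = {edges[0]:.0f} to {edges[-1]:.0f} ms")
```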


Frontiers in Psychology | 2015

Do gender differences in audio-visual benefit and visual influence in audio-visual speech perception emerge with age?

Magnus Alm; Dawn M. Behne

Gender and age have been found to affect adults’ audio-visual (AV) speech perception. However, research on adult aging focuses on adults over 60 years, who have an increasing likelihood for cognitive and sensory decline, which may confound positive effects of age-related AV-experience and its interaction with gender. Observed age and gender differences in AV speech perception may also depend on measurement sensitivity and AV task difficulty. Consequently, both AV benefit and visual influence were used to measure visual contribution for gender-balanced groups of young (20–30 years) and middle-aged adults (50–60 years), with task difficulty varied using AV syllables from different talkers in alternative auditory backgrounds. Females had better speech-reading performance than males. Whereas no gender differences in AV benefit or visual influence were observed for young adults, visually influenced responses were significantly greater for middle-aged females than middle-aged males. That speech-reading performance did not influence AV benefit may be explained by visual speech extraction and AV integration constituting independent abilities. In contrast, the gender difference in visually influenced responses in middle adulthood may reflect an experience-related shift in females’ general AV perceptual strategy. Although young females’ speech-reading proficiency may not readily contribute to greater visual influence, between young and middle adulthood recurrent confirmation of the contribution of visual cues induced by speech-reading proficiency may gradually shift females’ AV perceptual strategy toward more visually dominated responses.


Journal of the Acoustical Society of America | 2014

Age mitigates the correlation between cognitive processing speed and audio-visual asynchrony detection in speech.

Magnus Alm; Dawn M. Behne

Cognitive processing speed, hearing acuity, and audio-visual (AV) experience have been suggested to influence AV asynchrony detection. Whereas the influences of hearing acuity and AV experience have been explored to some extent, the influence of cognitive processing speed on perceived AV asynchrony has not been directly tested. Therefore, the current study investigates the relationship between cognitive processing speed and AV asynchrony detection in speech and, with hearing acuity controlled, assesses whether age-related AV experience mitigates the strength of this relationship. Cognitive processing speed and AV asynchrony detection were measured for 20 young adults (20-30 years) and 20 middle-aged adults (50-60 years) using auditory, visual, and AV recognition reaction-time tasks and an AV synchrony judgment task. Strong correlations between audio, visual, and AV reaction times and AV synchrony window size were found for young adults, but not for middle-aged adults. These findings suggest that although cognitive processing speed influences AV asynchrony detection in speech, the strength of the relationship is seemingly reduced by AV experience.
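The sketch below illustrates, with invented numbers, the kind of within-group correlation between reaction times and synchrony-window size that the abstract describes; the data and effect sizes are assumptions, not the study's results.

```python
import numpy as np

# Hypothetical per-participant data (ms): mean AV recognition reaction time and
# AV synchrony-window width, for two groups of 20. Values are invented for
# illustration only and do not come from the study.
rng = np.random.default_rng(1)
young_rt = rng.normal(450, 40, 20)
young_window = 300 + 0.8 * (young_rt - 450) + rng.normal(0, 20, 20)   # correlated
middle_rt = rng.normal(500, 40, 20)
middle_window = 350 + rng.normal(0, 30, 20)                           # uncorrelated

# Pearson correlation between reaction time and window size within each group.
r_young = np.corrcoef(young_rt, young_window)[0, 1]
r_middle = np.corrcoef(middle_rt, middle_window)[0, 1]
print(f"young adults:       r = {r_young:+.2f}")
print(f"middle-aged adults: r = {r_middle:+.2f}")
```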


Journal of the Acoustical Society of America | 2013

Effects of musical experience on perception of audiovisual synchrony for speech and music

Dawn M. Behne; Magnus Alm; Aleksander Berg; Thomas Engell; Camilla Foyn; Canutte Johnsen; Thulasy Srigaran; Ane Eir Torsdottir

Perception of audiovisual synchrony relies on matching temporal attributes across sensory modalities. To investigate the influence of experience on cross-modal temporal integration, the effect of musical experience on the perception of audiovisual synchrony was studied with speech and music stimuli. Nine musicians and nine non-musicians meeting strict group criteria provided simultaneity judgments to audiovisual /ba/ and guitar-strum stimuli, each with 23 levels of audiovisual alignment. Although results for the speech and music stimuli differed, the two groups did not differ in their responses to the two types of stimuli. Consistent with previous research, responses from both groups show less temporal sensitivity to stimuli with video-lead than audio-lead. No significant between-group difference was found for video-lead thresholds. However, both for the speech and music stimuli, musicians had an audio-lead threshold significantly closer to the point of physical synchrony than non-musicians, indicating th...


Journal of the Acoustical Society of America | 2008

Identification of place of articulation and voicing in white and babble noise

Magnus Alm; Dawn M. Behne

Previous research shows that white noise influences the degree to which auditory and visual modalities are used in audio-visual (AV) speech perception. This study assesses identification of voicing and place of articulation (POA) in the infrequently studied natural babble noise. Incongruent monosyllabic AV stimuli were presented to 15 young adults in white and babble noise at 0 and -12 dB SNR, where voicing stimuli differed in voicing and voicing structure and POA stimuli differed in POA and POA structure. In white noise, POA stimuli received fewer audio responses than in babble, whereas voicing stimuli received more audio responses in white noise than in babble. Voiced syllables received more audio responses than voiceless syllables. Findings suggest that differences in noise type for POA and voicing identification are attributable to discrepancies in the acoustical attributes of the noise and target stimuli. In voicing identification, the spectral transition between aspiration (a stable signal) and voicing (a fluctuating signal) is less distinct in the fluctuating babble noise than in the flat-power white noise. Voiceless consonants are more spectrally similar to white noise than voiced consonants, making the latter more auditorily accessible. Visual cues aid voicing identification, but only when POA is visually salient and auditorily susceptible to the noise type.


Journal of the Acoustical Society of America | 2006

Audiovisual perception of voicing with age in quiet and cafe noise

Dawn M. Behne; Yue Wang; Magnus Alm; Ingrid Arntsen; Ragnhild Eg; Ane Valsø

Research has shown that voicing is difficult to discern in noisy environments. While voicing may be difficult to resolve from visual cues, acoustic cues for voicing are relatively robust. This study addresses these factors in normally aging audiovisual perception. Identification responses were gathered from 19–30-year-old and 49–60-year-old adults for audiovisual (AV) CVs differing in voicing and consonant place of articulation. Materials were presented in quiet and in cafe noise (SNR=0 dB) as audio-only (A), visual-only (V), congruent AV, and incongruent AV. Results show a tendency toward use of visual information with age and noise for consonant place of articulation. Notably for voicing, incongruent AV materials that had one voiced component, regardless of whether it was A or V that was voiced, were consistently perceived as voiced in both age groups and regardless of noise. Only if the A and V components were both voiceless was the syllable perceived as voiceless. These findings indicate the influence of ag...


AVSP | 2007

Changes in audio-visual speech perception during adulthood

Dawn M. Behne; Yue Wang; Magnus Alm; Ingrid Arntsen; Ragnhild Eg; Ane Valsø


Conference of the International Speech Communication Association | 2008

Voicing influences the saliency of place of articulation in audio-visual speech perception in babble.

Magnus Alm; Dawn M. Behne


The 14th International Conference on Auditory-Visual Speech Processing | 2017

Perceived Audiovisual Simultaneity in Speech by Musicians and Nonmusicians: Preliminary Behavioral and Event-Related Potential (ERP) Findings.

Dawn M. Behne; Marzieh Sorati; Magnus Alm

Collaboration


Dive into Magnus Alm's collaborations.

Top Co-Authors

Dawn M. Behne
Norwegian University of Science and Technology

Ragnhild Eg
Simula Research Laboratory

Yue Wang
Simon Fraser University