Publications


Featured research published by Joost X. Maier.


The Journal of Neuroscience | 2005

Multisensory Integration of Dynamic Faces and Voices in Rhesus Monkey Auditory Cortex

Asif A. Ghazanfar; Joost X. Maier; Kari L. Hoffman; Nikos K. Logothetis

In the social world, multiple sensory channels are used concurrently to facilitate communication. Among human and nonhuman primates, faces and voices are the primary means of transmitting social signals (Adolphs, 2003; Ghazanfar and Santos, 2004). Primates recognize the correspondence between species-specific facial and vocal expressions (Massaro, 1998; Ghazanfar and Logothetis, 2003; Izumi and Kojima, 2004), and these visual and auditory channels can be integrated into unified percepts to enhance detection and discrimination. Where and how such communication signals are integrated at the neural level are poorly understood. In particular, it is unclear what role “unimodal” sensory areas, such as the auditory cortex, may play. We recorded local field potential activity, the signal that best correlates with human imaging and event-related potential signals, in both the core and lateral belt regions of the auditory cortex in awake behaving rhesus monkeys while they viewed vocalizing conspecifics. We demonstrate unequivocally that the primate auditory cortex integrates facial and vocal signals through enhancement and suppression of field potentials in both the core and lateral belt regions. The majority of these multisensory responses were specific to face/voice integration, and the lateral belt region showed a greater frequency of multisensory integration than the core region. These multisensory processes in the auditory cortex likely occur via reciprocal interactions with the superior temporal sulcus.
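The enhancement and suppression reported here are commonly summarized with an index comparing the bimodal response against the strongest unimodal response. A minimal Python sketch of that convention, using hypothetical response magnitudes rather than data from the paper:

```python
def multisensory_index(av: float, a: float, v: float) -> float:
    """Percent change of the bimodal (face + voice) response relative to the
    strongest unimodal response: positive = enhancement, negative = suppression."""
    best_unimodal = max(a, v)
    return 100.0 * (av - best_unimodal) / best_unimodal

# Hypothetical LFP response magnitudes (arbitrary units), not values from the study
print(f"{multisensory_index(av=1.8, a=1.2, v=0.4):+.1f}%")  # +50.0% (enhancement)
print(f"{multisensory_index(av=0.9, a=1.2, v=0.4):+.1f}%")  # -25.0% (suppression)
```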


Current Biology | 2007

Vocal-tract resonances as indexical cues in rhesus monkeys

Asif A. Ghazanfar; Hjalmar K. Turesson; Joost X. Maier; Ralph van Dinther; Roy D. Patterson; Nikos K. Logothetis

Vocal-tract resonances (or formants) are acoustic signatures in the voice and are related to the shape and length of the vocal tract. Formants play an important role in human communication, helping us not only to distinguish several different speech sounds [1], but also to extract important information related to the physical characteristics of the speaker, so-called indexical cues. How did formants come to play such an important role in human vocal communication? One hypothesis suggests that the ancestral role of formant perception—a role that might be present in extant nonhuman primates—was to provide indexical cues [2–5]. Although formants are present in the acoustic structure of vowel-like calls of monkeys [3–8] and implicated in the discrimination of call types [8–10], it is not known whether they use this feature to extract indexical cues. Here, we investigate whether rhesus monkeys can use the formant structure in their “coo” calls to assess the age-related body size of conspecifics. Using a preferential-looking paradigm [11, 12] and synthetic coo calls in which formant structure simulated an adult/large- or juvenile/small-sounding individual, we demonstrate that untrained monkeys attend to formant cues and link large-sounding coos to large faces and small-sounding coos to small faces—in essence, they can, like humans [13], use formants as indicators of age-related body size.
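Why formant structure can signal body size is captured by the standard uniform-tube approximation of the vocal tract, in which resonances fall at odd quarter-wavelengths and therefore scale inversely with tract length. A minimal sketch, with illustrative tract lengths that are not measurements from the study:

```python
SPEED_OF_SOUND = 35000.0  # cm/s in warm, humid air

def formants(tract_length_cm: float, n: int = 3) -> list[float]:
    """First n resonances (Hz) of a uniform tube closed at one end:
    F_k = (2k - 1) * c / (4 * L)."""
    return [(2 * k - 1) * SPEED_OF_SOUND / (4 * tract_length_cm)
            for k in range(1, n + 1)]

# A longer (adult-like) tract yields lower, more closely spaced formants,
# which is the acoustic dimension the synthetic coo calls manipulate.
print([round(f) for f in formants(10.0)])  # adult-like:    [875, 2625, 4375]
print([round(f) for f in formants(6.0)])   # juvenile-like: [1458, 4375, 7292]
```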


Neuron | 2004

Multisensory Integration of Looming Signals by Rhesus Monkeys

Joost X. Maier; John G. Neuhoff; Nikos K. Logothetis; Asif A. Ghazanfar

Looming objects produce ecologically important signals that can be perceived in both the visual and auditory domains. Using a preferential looking technique with looming and receding visual and auditory stimuli, we examined the multisensory integration of looming stimuli by rhesus monkeys. We found a strong attentional preference for coincident visual and auditory looming but no analogous preference for coincident stimulus recession. Consistent with previous findings, the effect occurred only with tonal stimuli and not with broadband noise. The results suggest an evolved capacity to integrate multisensory looming objects.
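A minimal sketch of stimuli of this kind: a tonal carrier whose intensity rises over time to mimic approach (looming) or falls to mimic recession. The frequency, duration, and intensity range are illustrative choices, not the study's parameters:

```python
import numpy as np

FS = 44100    # sample rate (Hz)
DUR = 1.0     # duration (s)
F0 = 1000.0   # tonal carrier (the effect reported above is specific to tones)

t = np.arange(int(FS * DUR)) / FS
carrier = np.sin(2 * np.pi * F0 * t)

# A linear rise in level (dB) approximates an approaching sound source.
db_ramp = np.linspace(-20.0, 0.0, t.size)   # -20 dB up to 0 dB
looming = carrier * 10.0 ** (db_ramp / 20.0)
receding = looming[::-1]                    # time-reversed: falling intensity
```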


Current Biology | 2008

Integration of Bimodal Looming Signals through Neuronal Coherence in the Temporal Lobe

Joost X. Maier; Chandramouli Chandrasekaran; Asif A. Ghazanfar

The ability to integrate information across multiple sensory systems offers several behavioral advantages, from quicker reaction times and more accurate responses to better detection and more robust learning. At the neural level, multisensory integration requires large-scale interactions between different brain regions: the convergence of information from separate sensory modalities, represented by distinct neuronal populations. The interactions between these neuronal populations must be fast and flexible, so that behaviorally relevant signals belonging to the same object or event can be immediately integrated and integration of unrelated signals can be prevented. Looming signals are a particular class of signals that are behaviorally relevant for animals and that occur in both the auditory and visual domain. These signals indicate the rapid approach of objects and provide highly salient warning cues about impending impact. We show here that multisensory integration of auditory and visual looming signals may be mediated by functional interactions between auditory cortex and the superior temporal sulcus, two areas involved in integrating behaviorally relevant auditory-visual signals. Audiovisual looming signals elicited increased gamma-band coherence between these areas, relative to unimodal or receding-motion signals. This suggests that the neocortex uses fast, flexible intercortical interactions to mediate multisensory integration.
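The coherence measure described here can be sketched with standard spectral tools: magnitude-squared coherence between two simultaneously recorded signals, averaged over a gamma band. The snippet uses synthetic stand-ins for the two field-potential channels, and all parameters (band limits, sample rate) are illustrative:

```python
import numpy as np
from scipy.signal import coherence

FS = 1000  # sample rate (Hz)
rng = np.random.default_rng(0)

# Two noisy channels sharing a common 60 Hz (gamma-band) component,
# standing in for auditory cortex and STS local field potentials.
t = np.arange(5 * FS) / FS
shared = np.sin(2 * np.pi * 60.0 * t)
lfp_ac = shared + rng.standard_normal(t.size)
lfp_sts = shared + rng.standard_normal(t.size)

f, cxy = coherence(lfp_ac, lfp_sts, fs=FS, nperseg=512)
gamma = (f >= 40) & (f <= 80)
print(f"mean 40-80 Hz coherence: {cxy[gamma].mean():.2f}")
```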


The Journal of Neuroscience | 2009

Natural, Metaphoric, and Linguistic Auditory Direction Signals Have Distinct Influences on Visual Motion Processing

Sepideh Sadaghiani; Joost X. Maier; Uta Noppeney

To interact with our dynamic environment, the brain merges motion information from auditory and visual senses. However, not only “natural” auditory MOTION, but also “metaphoric” de/ascending PITCH and SPEECH (e.g., “left/right”), influence the visual motion percept. Here, we systematically investigate whether these three classes of direction signals influence visual motion perception through shared or distinct neural mechanisms. In a visual-selective attention paradigm, subjects discriminated the direction of visual motion at several levels of reliability, with an irrelevant auditory stimulus being congruent, absent, or incongruent. Although the natural, metaphoric, and linguistic auditory signals were equally long and adjusted to induce a comparable directional bias on the motion percept, they influenced visual motion processing at different levels of the cortical hierarchy. A significant audiovisual interaction was revealed for MOTION in left human motion complex (hMT+/V5+) and for SPEECH in right intraparietal sulcus. In fact, the audiovisual interaction gradually decreased in left hMT+/V5+ for MOTION > PITCH > SPEECH and in right intraparietal sulcus for SPEECH > PITCH > MOTION. In conclusion, natural motion signals are integrated in audiovisual motion areas, whereas the influence of culturally learnt signals emerges primarily in higher-level convergence regions.
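The directional bias on the motion percept can be illustrated as a shift in the point of subjective equality (PSE) of a psychometric function fit to direction judgments under congruent versus incongruent sound. A sketch with invented response proportions, not the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(x, mu, sigma):
    """Cumulative Gaussian: p('rightward') vs. signed visual motion strength."""
    return norm.cdf(x, loc=mu, scale=sigma)

x = np.array([-0.4, -0.2, -0.1, 0.0, 0.1, 0.2, 0.4])  # signed motion strength
p_congruent = np.array([0.05, 0.15, 0.35, 0.60, 0.80, 0.92, 0.99])
p_incongruent = np.array([0.02, 0.08, 0.20, 0.40, 0.65, 0.85, 0.97])

(mu_c, _), _ = curve_fit(psychometric, x, p_congruent, p0=[0.0, 0.2])
(mu_i, _), _ = curve_fit(psychometric, x, p_incongruent, p0=[0.0, 0.2])
print(f"PSE shift (incongruent - congruent): {mu_i - mu_c:+.3f}")
```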


The Journal of Neuroscience | 2007

Looming Biases in Monkey Auditory Cortex

Joost X. Maier; Asif A. Ghazanfar

Looming signals (signals that indicate the rapid approach of objects) are behaviorally relevant signals for all animals. Accordingly, studies in primates (including humans) reveal attentional biases for detecting and responding to looming versus receding signals in both the auditory and visual domains. We investigated the neural representation of these dynamic signals in the lateral belt auditory cortex of rhesus monkeys. By recording local field potential and multiunit spiking activity while the subjects were presented with auditory looming and receding signals, we show here that auditory cortical activity was biased in magnitude toward looming versus receding stimuli. This directional preference was attributable neither to the absolute intensity of the sounds nor to simple adaptation, because white noise stimuli with identical amplitude envelopes did not elicit the same pattern of responses. This asymmetrical representation of looming versus receding sounds in the lateral belt auditory cortex suggests that it is an important node in the neural network correlate of looming perception.
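The magnitude bias toward looming is often summarized as a normalized contrast between the two response magnitudes; a minimal sketch with placeholder values (the index form is a common convention, not the paper's stated analysis):

```python
def looming_bias(r_loom: float, r_recede: float) -> float:
    """Normalized bias in [-1, 1]; positive values mean a stronger
    response to looming than to receding stimuli."""
    return (r_loom - r_recede) / (r_loom + r_recede)

# Hypothetical response magnitudes (spikes/s above baseline)
print(f"tone:  {looming_bias(24.0, 15.0):+.2f}")  # clear looming bias
print(f"noise: {looming_bias(18.0, 17.5):+.2f}")  # near zero, as with the
                                                  # amplitude-matched noise control
```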


Journal of Experimental Psychology: Human Perception and Performance | 2011

Audiovisual Asynchrony Detection in Human Speech

Joost X. Maier; Massimiliano Di Luca; Uta Noppeney

Combining information from the visual and auditory senses can greatly enhance intelligibility of natural speech. Integration of audiovisual speech signals is robust even when temporal offsets are present between the component signals. In the present study, we characterized the temporal integration window for speech and nonspeech stimuli with similar spectrotemporal structure to investigate to what extent humans have adapted to the specific characteristics of natural audiovisual speech. We manipulated spectrotemporal structure of the auditory signal, stimulus length, and task context. Results indicate that the temporal integration window is narrower and more asymmetric for speech than for nonspeech signals. When perceiving audiovisual speech, subjects tolerate visual-leading asynchronies, but are nevertheless very sensitive to auditory-leading asynchronies that are less likely to occur in natural speech. Thus, speech perception may be fine-tuned to the natural statistics of audiovisual speech, where facial movements always occur before acoustic speech articulation.
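The narrower, asymmetric window can be modeled, for illustration, as a piecewise Gaussian over stimulus-onset asynchrony (SOA) whose visual-leading side is wider. The widths below are illustrative, not fitted values from the paper:

```python
import math

def p_synchronous(soa_ms: float, sigma_audio_lead: float = 60.0,
                  sigma_visual_lead: float = 120.0) -> float:
    """Probability of a 'synchronous' judgment. Negative SOA = auditory leads
    (narrow side); positive SOA = visual leads (wide side, as in speech)."""
    sigma = sigma_audio_lead if soa_ms < 0 else sigma_visual_lead
    return math.exp(-0.5 * (soa_ms / sigma) ** 2)

for soa in (-100, 0, 100):
    print(f"SOA {soa:+4d} ms -> p(sync) = {p_synchronous(soa):.2f}")
# An equal-magnitude offset is judged synchronous more often when vision leads.
```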


Seeing and Perceiving | 2012

Different classes of audiovisual correspondences are processed at distinct levels of the cortical hierarchy

Uta Noppeney; Ruth Adam; Sepideh Sadaghiani; Joost X. Maier; Hwee Ling Lee; S. Werner; Dirk Ostwald; R. Lewis; Conrad

The brain should integrate sensory inputs only when they emanate from a common source and segregate those from different sources. Sensory correspondences are important cues informing the brain whether two sensory inputs are generated by a common event and should hence be integrated. Most prominently, sensory inputs should co-occur in time and space. More complex audiovisual stimuli may also be congruent in terms of semantics (e.g., objects and source sounds) or phonology (e.g., spoken and written words; linked via common linguistic labels). Surprisingly, metaphoric relations (e.g., pitch and height) have also been shown to influence audiovisual integration. The neural mechanisms that mediate these metaphoric congruency effects remain poorly understood. They may be mediated via (i) natural multisensory binding, (ii) common linguistic labels or (iii) semantics. In this talk, we will present a series of studies that investigate whether these different types of audiovisual correspondences are processed by distinct neural systems. Further, we investigate how those systems are employed by metaphoric audiovisual correspondences. Our results demonstrate that different classes of audiovisual correspondences influence multisensory integration at distinct levels of the cortical hierarchy. Spatiotemporal incongruency is detected as early as the primary cortical level. Natural (e.g., motion direction) and phonological incongruencies influence multisensory integration in areas involved in motion or phonological processing. Critically, metaphoric interactions emerge in neural systems that are shared with natural and semantic incongruency. This activation pattern may reflect the ambivalent nature of metaphoric audiovisual interactions, which rely on both natural and semantic correspondences.


9th International Multisensory Research Forum (IMRF 2008) | 2008

Natural, metaphoric and linguistic auditory-visual interactions

Sepideh Sadaghiani; Joost X. Maier; Uta Noppeney


6th International Multisensory Research Forum (IMRF 2005) | 2005

Multisensory processing of looming signals in primates

Joost X. Maier; Nikos K. Logothetis; Asif A. Ghazanfar

Collaboration


Dive into Joost X. Maier's collaborations.

Top Co-Authors

Uta Noppeney

University of Birmingham
